Changing learning rate after checkpoint

During training of a neural network in PyTorch, I saved a checkpoint with a learning rate of 0.005, but when I resumed training from that checkpoint I changed the learning rate to 0.05. Does it affect the training?

Yes, increasing the learning rate will affect the training. E.g. if it's too high, it could effectively "reset" the training progress.
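If you want to override the learning rate after restoring the optimizer state, something like the sketch below would do it. The checkpoint file name and the keys ("model", "optimizer") are just placeholders for whatever you saved:

```python
import torch
import torch.nn as nn

# Minimal sketch: resume from a checkpoint and override the learning rate.
# The file name and checkpoint keys are assumptions, not a fixed convention.
model = nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.005)

checkpoint = torch.load("checkpoint.pth")
model.load_state_dict(checkpoint["model"])
optimizer.load_state_dict(checkpoint["optimizer"])  # restores lr=0.005

# Set the new (larger) learning rate on all parameter groups
for param_group in optimizer.param_groups:
    param_group["lr"] = 0.05
```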

If I use a scheduler which reduces my learning rate every 5,000 epochs by a factor of 0.2, e.g. I start with lr=0.05 and it then becomes 0.01, then 0.002, and then 0.0004, does it have an effect on the training?

Yes, all changes to the learning rate will affect the training.
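For reference, the decay you describe can be expressed with a step scheduler; a minimal sketch (the model and loop are placeholders) could look like this:

```python
import torch
import torch.nn as nn

# Sketch of the schedule from the question: multiply the lr by 0.2 every
# 5000 epochs, starting at 0.05 (0.05 -> 0.01 -> 0.002 -> 0.0004 -> ...).
model = nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=5000, gamma=0.2)

for epoch in range(20000):
    # ... your training loop for one epoch would go here ...
    scheduler.step()  # advance the schedule once per epoch
```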