I’m using Adam for optimization. Should I change the learning rate like this?
for param_group in optimizer.param_groups:
    param_group['lr'] = lr
It seems that every time I change the learning rate, the loss increases a lot and the accuracy drops at the transition point. What’s the reason?
That is the correct way to manually change a learning rate, and it’s fine to use it with Adam. As for why your loss increases when you change it: we can’t even guess without knowing how you’re changing the learning rate (increasing or decreasing it), whether that’s the training or validation loss/accuracy, and details about the problem you’re solving. The reason could be anything from “you’re choosing the wrong learning rate” to “your optimization jumped out of a local minimum”.
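For concreteness, here is a minimal sketch of that manual approach wrapped in a helper. The model, sizes, and schedule here are made-up placeholders, not your setup:

```python
import torch

# Toy stand-in model and optimizer (hypothetical sizes).
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

def set_lr(optimizer, lr):
    # Write the new lr into every param group, exactly as in your snippet.
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr

# e.g. decay by 10x at some epoch boundary you choose:
set_lr(optimizer, 1e-4)
print(optimizer.param_groups[0]['lr'])
```

Note that this only changes the step size; Adam keeps its internal state across the change.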
It’s probably best to build up your own intuition about what’s happening in the optimization, if you’re interested.
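If you’d rather not manage the transitions by hand while you investigate, PyTorch’s built-in schedulers do the same bookkeeping for you. A sketch with `StepLR` (the model and schedule values are arbitrary placeholders):

```python
import torch

model = torch.nn.Linear(4, 2)  # toy stand-in model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Multiply the lr by gamma every step_size epochs.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.1)

for epoch in range(20):
    # ... training loop: forward, backward, optimizer.step() ...
    scheduler.step()  # updates the lr stored in optimizer.param_groups

print(scheduler.get_last_lr())  # 1e-3 * 0.1**2 after 20 epochs
```

This is equivalent to the manual loop, just with the decay points decided up front.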