How can I get the current learning rate being used by my optimizer?
Many of the optimizers in the torch.optim package use adaptive learning rates. You provide an initial one, but the effective rate should then change depending on the data. I would like to be able to check the current rate being used at any given time.
This question is basically a duplicate of this one, but I don’t think that one was very satisfactorily answered. Using Adam, for example, when I print:
import torch

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # model defined earlier
# ... training ...
for param_group in optimizer.param_groups:
    print(param_group['lr'])
I always see the initial learning rate no matter how many epochs of data I run.
However, if I want to resume interrupted training, or even just debug my loss, it makes sense to know where the optimizer left off.
I am guessing the reason you always see the same lr is that Adam itself never modifies param_group['lr']; that value only changes if you attach an LR scheduler that updates it. What does change over training is the effective per-parameter step size, which also depends on the moment estimates, so it differs from the lr you are printing.
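As a rough way to see this, here is a minimal sketch based on the textbook Adam update (theta <- theta - lr * m_hat / (sqrt(v_hat) + eps)) and PyTorch's usual Adam state keys ('step', 'exp_avg_sq'). The helper name adam_effective_lr is just something I made up for illustration, not a torch API, and PyTorch's internal Adam kernel arranges the arithmetic a bit differently, so treat this as an approximation:

def adam_effective_lr(optimizer):
    # Estimate the multiplier lr / (sqrt(v_hat) + eps) that Adam applies to the
    # bias-corrected first moment for each parameter element.
    factors = []
    for group in optimizer.param_groups:
        lr = group['lr']
        beta2 = group['betas'][1]
        eps = group['eps']
        for p in group['params']:
            state = optimizer.state.get(p, {})
            if 'exp_avg_sq' not in state:
                continue  # parameter has not been updated yet
            step = float(state['step'])  # int or scalar tensor depending on version
            v_hat = state['exp_avg_sq'] / (1 - beta2 ** step)  # bias-corrected 2nd moment
            factors.append(lr / (v_hat.sqrt() + eps))
    return factors

Each returned tensor has the same shape as its parameter, and comparing them across training steps should show the effective rate moving even though param_group['lr'] stays at 1e-4. If you actually schedule the learning rate (e.g. with something from torch.optim.lr_scheduler), then param_group['lr'] does change, and printing it the way you already do is the right way to read the current value.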