I see that I can use print_lr(is_verbose, group, lr, epoch=None) to see the lr, but whatever I do it shows the same thing. Shouldn't it be different for different epochs?
e.g.
I tried: `scheduler.print_lr(True, optimizer, args.lr, epoch=100)`
I don’t think this method is supposed to be exposed, as it’s just printing the input.
Based on this code:
```python
def print_lr(self, is_verbose, group, lr, epoch=None):
    """Display the current learning rate.
    """
    if is_verbose:
        if epoch is None:
            print('Adjusting learning rate'
                  ' of group {} to {:.4e}.'.format(group, lr))
        else:
            print('Epoch {:5d}: adjusting learning rate'
                  ' of group {} to {:.4e}.'.format(epoch, group, lr))
```
it seems to be a convenience method for printing lr updates when the verbose option is enabled; it just formats and prints whatever `lr` value you pass in, which is why you always see the same output.
I think you should use scheduler.get_last_lr() instead.
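For example, a minimal sketch (using a hypothetical linear model and StepLR just to demonstrate the API) that prints the learning rate actually in use after each scheduler step:

```python
import torch
from torch import nn, optim

# Toy model and optimizer, only to illustrate the scheduler API.
model = nn.Linear(10, 1)
optimizer = optim.SGD(model.parameters(), lr=0.1)
# Halve the lr every 2 scheduler steps.
scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=2, gamma=0.5)

for epoch in range(4):
    optimizer.step()   # normally inside a full training loop
    scheduler.step()
    # get_last_lr() returns one lr per param group.
    print(epoch, scheduler.get_last_lr())
```

Unlike calling print_lr yourself, get_last_lr() reads the value the scheduler has actually set on the optimizer's parameter groups, so it changes across epochs as expected.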