How to use print_lr in the lr_scheduler?

I'm trying to use the built-in function for printing the learning rate in my scheduler:

scheduler = StepLR(optimizer, step_size=3, gamma=0.1)

I see that I can use print_lr(is_verbose, group, lr, epoch=None) to see the lr, but whatever I do it shows the same thing. Shouldn't it be different for different epochs?
e.g.
I tried:
scheduler.print_lr(True,optimizer,args.lr,epoch=100)

and

scheduler.print_lr(True,optimizer,args.lr,epoch=10)

and both give the same output.

I don’t think this method is supposed to be exposed, as it just prints the inputs you pass in.
Based on this code:

    def print_lr(self, is_verbose, group, lr, epoch=None):
        """Display the current learning rate.
        """
        if is_verbose:
            if epoch is None:
                print('Adjusting learning rate'
                      ' of group {} to {:.4e}.'.format(group, lr))
            else:
                print('Epoch {:5d}: adjusting learning rate'
                      ' of group {} to {:.4e}.'.format(epoch, group, lr))

it seems to be a convenience method for printing the lr updates in case you are using the verbose option.
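To see why both of your calls print the same value, note that the method only formats the arguments you hand it; it never reads the scheduler's state. A torch-free sketch of the quoted code makes this obvious:

```python
def print_lr(is_verbose, group, lr, epoch=None):
    # Same logic as the quoted scheduler method: it only formats
    # the values passed in, so the lr shown is whatever you supplied.
    if is_verbose:
        if epoch is None:
            print('Adjusting learning rate'
                  ' of group {} to {:.4e}.'.format(group, lr))
        else:
            print('Epoch {:5d}: adjusting learning rate'
                  ' of group {} to {:.4e}.'.format(epoch, group, lr))

# Whatever epoch you pass, the lr printed is the one you supplied:
print_lr(True, 0, 0.01, epoch=10)
print_lr(True, 0, 0.01, epoch=100)
```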

I think you should use scheduler.get_last_lr() instead.
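A minimal sketch, using a toy model plus the StepLR settings from the question, showing that get_last_lr() does track the decay:

```python
import torch
from torch.optim.lr_scheduler import StepLR

# toy model/optimizer just so the scheduler has something to drive
model = torch.nn.Linear(2, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = StepLR(optimizer, step_size=3, gamma=0.1)

lrs = []
for epoch in range(7):
    optimizer.step()        # your training step would go here
    scheduler.step()
    # get_last_lr() returns one lr per param group, reflecting
    # the scheduler's current state (drops by 10x every 3 epochs)
    lrs.append(scheduler.get_last_lr()[0])
    print(epoch, scheduler.get_last_lr())
```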
