LR scheduler: print learning rate only when changing it

When setting verbose=True, the message 'Adjusting learning rate…' is printed every time the command

schedule.step()

is called.

I want to modify that so the message is printed only when there is an actual change in the LR.

I looked in the source code and found a method, print_lr, which belongs to the base class, I think.
I don't understand how to call it from my own code, say every time the scheduler actually updates.

What I tried so far:
printing manually, which results in an error

self.schedule.print_lr(True, 0, self.schedule.get_last_lr(), 0)
prints:
line 113, in print_lr
    ' of group {} to {:.4e}.'.format(group, lr))
TypeError: unsupported format string passed to list.__format__
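
From the traceback, it looks like print_lr expects a single float for lr, while get_last_lr() returns a list with one entry per param group. Indexing into it (group index 0 here is just for illustration) gets past the TypeError, but still prints on every step:

self.schedule.print_lr(True, 0, self.schedule.get_last_lr()[0], 0)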

Would appreciate any advice!

Hi, what exact scheduler are you using?

StepLR.
I call schedule.step() every batch, and the LR updates every 1000 batches.
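Roughly this setup (train_step and the gamma value are just placeholders):

scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1000, gamma=0.1, verbose=True)

for batch in dataloader:
    train_step(batch)   # forward/backward/optimizer.step()
    scheduler.step()    # called every batch; the LR only decays every 1000 steps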

Yes, I can see this unexpected behavior of printing even when not really adjusting the learning rate. First, you can decrease the amount of printing by stepping only once per epoch, as this type of scheduler is designed to work on a per-epoch basis.

That would work, but I need to decrease the LR every 1000 batches because each epoch is very big.

I believe a working solution would be to set verbose=False and log the learning rate from the optimizer at every desired logging_step.
Something like this:

logging_step = 1000
for step, batch in enumerate(dataloader):
    # some working logic here
    ...
    if step % logging_step == 0:
        print(f"{group['lr'] for group in optimizer.param_groups}")



Another way: you can copy/paste the StepLR scheduler code from PyTorch into your project and redefine the print_lr function.
Something like this should work:

def print_lr(self, is_verbose, group, lr, epoch=None):
    """Display the current learning rate, but only on steps that change it."""
    # _step_count - 1 is the number of completed steps, so this fires on the
    # initial step and then once every step_size steps, i.e. on actual decays.
    if is_verbose and (self._step_count - 1) % self.step_size == 0:
        if epoch is None:
            print('Adjusting learning rate'
                  ' of group {} to {:.4e}.'.format(group, lr))
        else:
            print('Epoch {:5d}: adjusting learning rate'
                  ' of group {} to {:.4e}.'.format(epoch, group, lr))
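
Instead of copy/pasting the whole scheduler, subclassing should also work, since step() routes its printing through print_lr (at least in versions where the verbose flag is still supported); QuietStepLR is just an illustrative name:

from torch.optim.lr_scheduler import StepLR

class QuietStepLR(StepLR):
    # Same override as above: only print on steps where the LR actually decays.
    def print_lr(self, is_verbose, group, lr, epoch=None):
        if is_verbose and (self._step_count - 1) % self.step_size == 0:
            print('Adjusting learning rate'
                  ' of group {} to {:.4e}.'.format(group, lr))

scheduler = QuietStepLR(optimizer, step_size=1000, gamma=0.1, verbose=True)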

Hey Alexey,
Thank you for the advice.
I went with the first solution of just manually printing the current LR, setting the print to run at the same time I update the LR.

The solution I had in mind was more like setting verbose=False and manually calling the print_lr function, but I guess the first solution is not so bad.

I do think it might be useful to add a third option to verbose: print only when the LR values are actually updated.

I agree with your logic on that. I can also understand the logic behind the PyTorch team's decision. I think of it as a consistent debugging interface: if you set verbose=True, you will see the learning rate at every scheduler step, for every scheduler you choose.