Confusion with LR Scheduler step() in PyTorch 1.1.0 and Later

Today I upgraded PyTorch from 1.1.0 to 1.2.0. When running the same code on PyTorch 1.2.0, I get the following warning:

```
/usr/local/lib/python3.5/dist-packages/torch/optim/lr_scheduler.py:82: UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`. In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`. Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
```
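
For reference, my loop looks roughly like this. The model and data below are toy stand-ins, and StepLR with step_size=1 is just my guess at a comparable scheduler; lr=0.001 and gamma=0.5 are the values from my real config:

```python
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.001)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.5)

for epoch in range(3):
    optimizer.zero_grad()
    loss = model(torch.randn(4, 10)).sum()
    loss.backward()
    scheduler.step()   # scheduler stepped before the optimizer -> triggers the warning
    optimizer.step()
```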

What is "the first value of the lr schedule"? Does that mean the scheduler will start the lr at 0.0005 for the first epoch if I set lr=0.001 and gamma=0.5 and call scheduler.step() before optimizer.step()?

BUT I printed the optimizer and scheduler.get_lr() every iteration and found that both calling orders (i.e. scheduler.step() then optimizer.step(), and optimizer.step() then scheduler.step()) start the lr at 0.001 for the first epoch.
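
This is a sketch of the check I ran (again with a toy model and StepLR standing in for my actual setup); I print the lr the optimizer actually uses each step, then swap the two step() calls and rerun to compare the orders:

```python
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.001)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.5)

for epoch in range(3):
    optimizer.zero_grad()
    model(torch.randn(4, 10)).sum().backward()
    optimizer.step()   # order under test; swap these two lines
    scheduler.step()   # for the other experiment
    print(epoch, scheduler.get_lr(), optimizer.param_groups[0]['lr'])
```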

Is there anything I'm misunderstanding, or what's going on here?

Thanks a lot.
