Resume training with LR scheduler

Hi all,
I’m trying to resume training on a model with a StepLR scheduler, and I keep getting the warning described in the following docs:
https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate

UserWarning: Detected call of 'lr_scheduler.step()' before 'optimizer.step()'

Basically, the issue is that I’ve saved the last recorded step (epoch) along with my model’s and optimizer’s state dicts. So if I’ve already trained for x epochs, I call lr_scheduler.step() x times before passing everything to the trainer.
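
For reference, here is a minimal sketch of my resume logic (the checkpoint path, keys, and hyperparameters are just illustrative stand-ins for my actual setup):

```python
import torch
from torch.optim.lr_scheduler import StepLR

model = torch.nn.Linear(10, 2)  # stand-in for my actual model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = StepLR(optimizer, step_size=30, gamma=0.1)

# Restore the saved states and the last recorded epoch
checkpoint = torch.load("checkpoint.pt")  # illustrative path/keys
model.load_state_dict(checkpoint["model_state_dict"])
optimizer.load_state_dict(checkpoint["optimizer_state_dict"])
start_epoch = checkpoint["epoch"]

# Fast-forward the scheduler to the saved epoch -- each call here
# fires the warning, since optimizer.step() hasn't been called yet
# in this process
for _ in range(start_epoch):
    scheduler.step()
```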

So is this warning something I should worry about in my case? The warning itself isn’t informative enough for me to understand what the real issue is (maybe it’s only problematic during training?). Comparing with a previous run, my training isn’t reproduced 100%, but it’s not off by much.

Thanks!

I think you can ignore the warning, since you are calling this method before training just to get back to the same epoch value.
The warning should be taken seriously if you see it inside your training loop.
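
Inside the loop, the ordering the warning refers to looks like this (a schematic sketch with dummy data; since PyTorch 1.1.0, optimizer.step() must come before lr_scheduler.step()):

```python
import torch
from torch.optim.lr_scheduler import StepLR

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = StepLR(optimizer, step_size=30, gamma=0.1)
criterion = torch.nn.MSELoss()

for epoch in range(5):  # dummy loop with random data
    inputs, targets = torch.randn(4, 10), torch.randn(4, 2)
    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    optimizer.step()   # optimizer step first...
    scheduler.step()   # ...then advance the LR schedule once per epoch
```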
