I’m trying to resume training on a model with a StepLR scheduler, and I keep getting the following warning:
UserWarning: Detected call of 'lr_scheduler.step()' before 'optimizer.step()'
Basically, the issue is that I’ve saved the last recorded step (epoch) along with my model’s and optimizer’s state dicts. So, if I’ve already trained for x epochs, I call `scheduler.step()` x times to fast-forward the scheduler before passing everything to the trainer.
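A minimal sketch of that resume pattern (the model, optimizer, and `start_epoch` value here are placeholders, and in practice `start_epoch` would come from the checkpoint):

```python
import torch

# Toy model/optimizer standing in for the real ones.
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)

start_epoch = 25  # in practice, loaded from the checkpoint

# Fast-forward the scheduler by calling step() x times, as described above.
# Each of these calls happens before any optimizer.step(), which is exactly
# what triggers the UserWarning.
for _ in range(start_epoch):
    scheduler.step()

# After two decay boundaries (epochs 10 and 20): lr = 0.1 * 0.5**2 = 0.025
print(optimizer.param_groups[0]["lr"])
```

Note that an alternative to this loop is saving `scheduler.state_dict()` in the checkpoint alongside the model and optimizer, then restoring it with `scheduler.load_state_dict(...)`, which avoids stepping the scheduler without an intervening `optimizer.step()`.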
So is this warning something I should worry about in my case? The warning itself isn’t informative enough for me to understand the real issue (maybe it’s only problematic during training?). Comparing with a previous run, I do see that my training isn’t reproduced 100%, but it’s not off by much.