Hello everyone,
I’m seeing some strange behavior and would like to understand what’s behind it.
In my model training loop:

```python
if phase == 'train':
    scheduler.step()  # <-------
    model.train()
    ...
    optimizer.step()
```
When I place the scheduler step before the optimizer step, I get a warning:
```
UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`.
In PyTorch 1.1.0 and later, you should call them in the opposite order:
`optimizer.step()` before `lr_scheduler.step()`. Failure to do this will
result in PyTorch skipping the first value of the learning rate schedule.
```
But the error rate is much better this way, which suggests the scheduler is working fine. When I fix it and put the scheduler step after the optimizer step, as the warning message recommends, the error is far larger and doesn’t decrease:
```python
if phase == 'train':
    model.train()
    ...
    optimizer.step()
    scheduler.step()  # <-------
```
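For reference, here is a minimal standalone sketch (a dummy parameter and `SGD` with `StepLR`, not my actual model) of what the recommended order means for the learning rate each update actually sees; in the old order, the first value of the schedule would be skipped:

```python
import torch

# Dummy parameter and optimizer, just to inspect the LR schedule.
param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.SGD([param], lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.5)

lrs = []
for epoch in range(3):
    # Record the LR that this epoch's updates will use.
    lrs.append(optimizer.param_groups[0]['lr'])
    optimizer.step()   # recommended order: optimizer first...
    scheduler.step()   # ...then the scheduler, once per epoch

print(lrs)  # starts at 0.1 and is halved after each epoch
```

With the old (pre-1.1.0) order, `scheduler.step()` would run before the first `optimizer.step()`, so the initial learning rate of 0.1 would never be used.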
Is there an explanation for why following the warning’s recommendation is bad for the training?