Why is there a line `scheduler.step(0)` in the code of class `SequentialLR`?

I was using a SequentialLR to adjust my learning rate in the training process. However, a warning appeared:

UserWarning: The epoch parameter in `scheduler.step()` was not necessary
and is being deprecated where possible. Please use `scheduler.step()` to 
step the scheduler. During the deprecation, if epoch is different from None, 
the closed form is used instead of the new chainable form, where available. 
Please open an issue if you are unable to replicate your use case: 
https://github.com/pytorch/pytorch/issues/new/choose.
warnings.warn(EPOCH_DEPRECATION_WARNING, UserWarning)

I checked my code and there was nothing like `scheduler.step(epoch)`. After debugging, I found the problem in `torch\optim\lr_scheduler.py`: the line `scheduler.step(0)` in `SequentialLR.step` is what triggers the warning.

def step(self):
    self.last_epoch += 1
    idx = bisect_right(self._milestones, self.last_epoch)
    scheduler = self._schedulers[idx]
    if idx > 0 and self._milestones[idx - 1] == self.last_epoch:
        scheduler.step(0)
    else:
        scheduler.step()

    self._last_lr = scheduler.get_last_lr()
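The milestone logic above can be sketched with stdlib-only code (a hypothetical standalone re-implementation for illustration, not the actual PyTorch class): `bisect_right` picks which sub-scheduler is active for the current epoch, and when the step lands exactly on a milestone, the real code restarts the newly selected scheduler via `step(0)`.

```python
from bisect import bisect_right

def pick_scheduler(milestones, last_epoch):
    """Mimic SequentialLR.step's selection: return (index, restart), where
    index is the sub-scheduler active at last_epoch, and restart is True
    exactly when this step crosses a milestone (the case where the real
    code calls scheduler.step(0), emitting the deprecation warning)."""
    idx = bisect_right(milestones, last_epoch)
    restart = idx > 0 and milestones[idx - 1] == last_epoch
    return idx, restart

# With milestones=[2, 5]: epochs 0-1 use scheduler 0,
# epochs 2-4 use scheduler 1, epochs 5+ use scheduler 2.
print(pick_scheduler([2, 5], 1))  # (0, False)
print(pick_scheduler([2, 5], 2))  # (1, True)  <- step(0) would fire here
print(pick_scheduler([2, 5], 3))  # (1, False)
```

So the `step(0)` branch only runs once per milestone crossing, to reset the incoming scheduler's internal epoch counter to zero.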

I wonder why there's a `scheduler.step(0)` call in `SequentialLR` at all, given that the epoch parameter is supposed to be deprecated.
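As a stopgap until this is fixed upstream, one can silence just this warning with Python's stdlib `warnings` filter (a sketch; the message pattern is copied from the warning text above and matched as a regex against the start of the message):

```python
import warnings

# Ignore only the epoch-deprecation UserWarning; other warnings still surface.
warnings.filterwarnings(
    "ignore",
    message=r"The epoch parameter in `scheduler\.step\(\)`",
    category=UserWarning,
)
```

This is coarser than fixing the root cause, but it keeps training logs clean without suppressing unrelated `UserWarning`s.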


Hi!

Most likely an oversight on our end during the deprecation process. Could you please open an issue about this on GitHub?

Sure, my pleasure. :smile: