How to use torch.optim.lr_scheduler.ExponentialLR?

Use scheduler.step() instead of scheduler.step(epoch). Passing the epoch explicitly has strange behaviour when using MultiStepLR, although it happens to work for StepLR in your example.
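A minimal sketch of the recommended pattern (the model and optimizer here are just placeholders for your own setup): build the scheduler once, then call scheduler.step() with no arguments once per epoch, after the optimizer updates for that epoch.

```python
import torch
from torch import nn, optim
from torch.optim.lr_scheduler import ExponentialLR

# Placeholder model and optimizer, just to make the example runnable.
model = nn.Linear(10, 1)
optimizer = optim.SGD(model.parameters(), lr=0.1)

# Multiply the learning rate by gamma once per epoch.
scheduler = ExponentialLR(optimizer, gamma=0.9)

for epoch in range(20):
    # ... your per-batch training loop with optimizer.step() goes here ...
    optimizer.step()      # parameter update (per batch in a real loop)
    scheduler.step()      # once per epoch, no epoch argument
    print(epoch, scheduler.get_last_lr())
```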

Ref: https://discuss.pytorch.org/t/whats-the-difference-between-scheduler-step-and-scheduler-step-epoch/73054
