I’m using PyTorch 1.4.0. What’s the difference between scheduler.step() and scheduler.step(epoch)? What exactly does scheduler.step(epoch) do?
import torch

optimizer1 = torch.optim.SGD([torch.randn(1, requires_grad=True)], lr=1e-3)
exp_lr_scheduler1 = torch.optim.lr_scheduler.MultiStepLR(
    optimizer1, milestones=[5, 10], gamma=0.1)

optimizer2 = torch.optim.SGD([torch.randn(1, requires_grad=True)], lr=1e-3)
exp_lr_scheduler2 = torch.optim.lr_scheduler.MultiStepLR(
    optimizer2, milestones=[5, 10], gamma=0.1)

for epoch in range(1, 15):
    exp_lr_scheduler1.step()
    exp_lr_scheduler2.step(epoch)
    print('Epoch {}, lr1 {}, lr2 {}'.format(
        epoch,
        optimizer1.param_groups[0]['lr'],
        optimizer2.param_groups[0]['lr']))
The output is as follows:
Epoch 1, lr1 0.001, lr2 1.0000000000000003e-05
Epoch 2, lr1 0.001, lr2 1.0000000000000003e-05
Epoch 3, lr1 0.001, lr2 1.0000000000000003e-05
Epoch 4, lr1 0.001, lr2 1.0000000000000003e-05
Epoch 5, lr1 0.0001, lr2 1.0000000000000003e-05
Epoch 6, lr1 0.0001, lr2 1.0000000000000003e-05
Epoch 7, lr1 0.0001, lr2 1.0000000000000003e-05
Epoch 8, lr1 0.0001, lr2 1.0000000000000003e-05
Epoch 9, lr1 0.0001, lr2 1.0000000000000003e-05
Epoch 10, lr1 1e-05, lr2 1.0000000000000003e-05
Epoch 11, lr1 1e-05, lr2 1.0000000000000003e-05
Epoch 12, lr1 1e-05, lr2 1.0000000000000003e-05
Epoch 13, lr1 1e-05, lr2 1.0000000000000003e-05
Epoch 14, lr1 1e-05, lr2 1.0000000000000003e-05
The two calls clearly behave differently with MultiStepLR: scheduler1 decays the learning rate at the milestones as expected, while scheduler2 sits at 1e-05 from the very first epoch. Why?
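For comparison, here is a minimal sketch of the no-argument form, which is the usage recommended in recent PyTorch releases (the epoch argument to step() has been deprecated there). It just records the learning rate each epoch to show the decays landing at the milestones; the training step is stubbed out:

```python
import torch

optimizer = torch.optim.SGD([torch.randn(1, requires_grad=True)], lr=1e-3)
scheduler = torch.optim.lr_scheduler.MultiStepLR(
    optimizer, milestones=[5, 10], gamma=0.1)

lrs = []
for epoch in range(15):
    # ... forward/backward would go here ...
    optimizer.step()                              # update weights first
    lrs.append(optimizer.param_groups[0]['lr'])   # lr used this epoch
    scheduler.step()                              # then advance the schedule

# lr stays at 1e-3 for epochs 0-4, decays by gamma=0.1 after epoch 4
# (milestone 5) and again after epoch 9 (milestone 10)
print(lrs)
```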