StepLR learning rate calculation error when step_size = 1

I ran the following code:

import torch
import torchvision

net = torchvision.models.resnet18()
optimizer = torch.optim.SGD(net.parameters(), lr=1)
lr_sche = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.1)
data = torch.rand(1, 3, 224, 224)
for epoch in range(5):
    optimizer.step()
    print('Epoch', epoch)
    print(lr_sche.get_lr())
    lr_sche.step()
    print(lr_sche.get_lr())

and got the following output:

Epoch 0
[1]
[0.010000000000000002]
Epoch 1
[0.010000000000000002]
[0.0010000000000000002]
Epoch 2
[0.0010000000000000002]
[0.00010000000000000003]
Epoch 3
[0.00010000000000000003]
[1.0000000000000004e-05]
Epoch 4
[1.0000000000000004e-05]
[1.0000000000000004e-06]

I am confused why the lr jumped directly from 1 to 0.01. Is this a bug, or am I using StepLR in the wrong way?

You should get a warning telling you to use get_last_lr() instead of get_lr():

UserWarning: To get the last learning rate computed by the scheduler, please use get_last_lr().

get_lr() is meant for internal use inside step(): with step_size=1 the decay condition is satisfied at every epoch, so calling get_lr() between steps multiplies the current lr by gamma one extra time, which is why you see 0.01 instead of 0.1. get_last_lr() returns the learning rate that was actually applied and should give you the expected values:

for epoch in range(5):
    optimizer.step()
    print('Epoch', epoch)
    print(lr_sche.get_last_lr())
    lr_sche.step()
    print(lr_sche.get_last_lr())

Epoch 0
[1]
[0.1]
Epoch 1
[0.1]
[0.010000000000000002]
Epoch 2
[0.010000000000000002]
[0.0010000000000000002]
Epoch 3
[0.0010000000000000002]
[0.00010000000000000003]
Epoch 4
[0.00010000000000000003]
[1.0000000000000004e-05]
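As a sanity check you can also bypass the scheduler's getters entirely and read the learning rate straight from the optimizer's param_groups, which always hold the value actually used by optimizer.step(). A minimal sketch (using a single dummy parameter instead of a full ResNet, just to keep it fast):

```python
import torch

# A single dummy parameter stands in for the model here.
param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.SGD([param], lr=1)
lr_sche = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.1)

for epoch in range(3):
    optimizer.step()
    # param_groups[0]['lr'] is the lr the optimizer actually applied this epoch.
    print('Epoch', epoch, 'lr =', optimizer.param_groups[0]['lr'])
    lr_sche.step()
```

This prints 1, then 0.1, then ~0.01, matching the get_last_lr() output above.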