Hi, I’m training my network with PyTorch, and I’ve run into a curious problem.
See, I want to use StepLR as my learning-rate schedule, and I expect lr = 0.0002 when epoch < 30 and lr = 0.00002 for 30 <= epoch < 40 (40 epochs in total). So I set up the scheduler as follows; my optimizer is Adam:
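(The setup itself wasn’t pasted, so here is a minimal sketch of what such a setup typically looks like; the placeholder model is an assumption, not the actual network:)

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 2)  # placeholder model, stands in for the real network
optimizer = torch.optim.Adam(model.parameters(), lr=0.0002)
# gamma=0.1 multiplies the lr by 0.1 every step_size epochs: 0.0002 -> 0.00002 at epoch 30
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)
```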

Second, should I put scheduler.step() ahead of optimizer.step()?

Hmm, so far I’ve been calling scheduler.step() after finishing each epoch.
What I’ve written is below:

for epoch in range(40):
    print(epoch, 'lr={:.6f}'.format(scheduler.get_lr()[0]))
    for i, image in enumerate(trainLoader):
        ...
        optimizer.step()
        if i % 200 == 0 and i != 0:
            print("Training: Epoch[{:0>3}/{:0>3}] Iteration[{:0>3}/{:0>3}] Loss: {:.4f} lr={:.8f}".format(
                epoch + 1, iterations, i + 1, len(trainLoader), loss_avg, scheduler.get_lr()[0]))
        ...
    scheduler.step()

Or should I modify it like this?

for epoch in range(40):
    scheduler.step()
    print(epoch, 'lr={:.6f}'.format(scheduler.get_last_lr()[0]))
    for i, image in enumerate(trainLoader):
        ...
        optimizer.step()
        if i % 200 == 0 and i != 0:
            print("Training: Epoch[{:0>3}/{:0>3}] Iteration[{:0>3}/{:0>3}] Loss: {:.4f} lr={:.8f}".format(
                epoch + 1, iterations, i + 1, len(trainLoader), loss_avg, scheduler.get_last_lr()[0]))
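For reference, the ordering the PyTorch docs describe (since 1.1) is optimizer.step() first, then scheduler.step() once per epoch, which matches the first version rather than the second. A minimal runnable sketch of that ordering (the placeholder model and the bare optimizer.step() standing in for the inner training loop are assumptions):

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 2)  # placeholder model
optimizer = torch.optim.Adam(model.parameters(), lr=0.0002)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)

for epoch in range(40):
    # get_last_lr() reports the lr actually in effect for this epoch
    print(epoch, 'lr={:.6f}'.format(scheduler.get_last_lr()[0]))
    optimizer.step()   # stands in for the inner training iterations
    scheduler.step()   # once per epoch, after optimizer.step()
```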