How can I decrease learning rate linearly after each epoch in PyTorch?

Hello everyone,
I want to decrease the learning rate linearly after each epoch during training.
Can I set this manually at every epoch by constructing torch.optim.SGD with a new learning rate value?
If anyone has a solution or instructions for this, please answer here.
Thank you all.

Yep, you can do that as follows:

for param_group in optimizer.param_groups:
    param_group['lr'] = lre

Just set lre to the value you want; for a linear decrease, update it from the previous value each epoch:

lre = lre - (maximum - minimum) / nb_epochs
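
A minimal sketch of this manual approach, assuming a starting rate maximum, a final rate minimum, and nb_epochs epochs; the model, loss, and data below are just placeholders to make it runnable:

import torch

# Placeholder model, loss, and data purely for illustration.
model = torch.nn.Linear(10, 2)
criterion = torch.nn.CrossEntropyLoss()
loader = [(torch.randn(4, 10), torch.randint(0, 2, (4,))) for _ in range(5)]

maximum, minimum, nb_epochs = 0.1, 0.001, 10
lre = maximum
optimizer = torch.optim.SGD(model.parameters(), lr=lre)

for epoch in range(nb_epochs):
    for inputs, targets in loader:
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        optimizer.step()

    # Decrease the learning rate linearly after each epoch.
    lre = lre - (maximum - minimum) / nb_epochs
    for param_group in optimizer.param_groups:
        param_group['lr'] = lre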

You can use one of the learning rate schedulers to do that. I think StepLR is what you are looking for.
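
A short sketch of the scheduler approach, with a placeholder model and made-up hyperparameters: StepLR multiplies the rate by gamma every step_size epochs, while LambdaLR with a linear factor gives the strictly linear decrease asked about here.

import torch

model = torch.nn.Linear(10, 2)  # placeholder model
nb_epochs = 10
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# StepLR: multiply the learning rate by gamma every step_size epochs.
# scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.9)

# LambdaLR: scale the initial lr by a factor that shrinks linearly each epoch.
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer, lr_lambda=lambda epoch: 1.0 - epoch / nb_epochs
)

for epoch in range(nb_epochs):
    # ... run the usual forward/backward/optimizer.step() updates here ...
    scheduler.step()  # update the learning rate after each epoch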