Changing learning rate multiple times during training

I want to train my network like so:

epoch 1 - 10      learning rate = 0.001
epoch 10 - 90     learning rate = 0.1
epoch 90 - 120    learning rate = 0.01
epoch 120 - 200   learning rate = 0.001

The reason I need a small learning rate at the beginning is to kick-start the network into training. I tried using LambdaLR to define

lambda1 = lambda epoch: epoch * 100
lambda2 = lambda epoch: epoch * 0.1

but I cannot specify the milestones. MultiStepLR, on the other hand, does not let you use multiple gammas. Is there a way around this?


I can think of a somewhat dirty way to define the lambda for LambdaLR:

lambda epoch: 0.001 if epoch < 10 else 0.1 if epoch < 90 else 0.01 if epoch < 120 else 0.001

Keep in mind that LambdaLR multiplies the optimizer's initial lr by whatever the lambda returns, and that the epoch counter is 0-based, so this gives exactly those rates only if the optimizer is created with lr=1.0 (otherwise return factors relative to the base lr).
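For example, a minimal sketch, assuming the optimizer is created with lr=1.0 so the lambda's return value becomes the effective learning rate (the model here is just a placeholder):

from torch import nn, optim

model = nn.Linear(10, 2)  # placeholder model for illustration

# base lr of 1.0 so the lambda's output is used as the actual learning rate
optimizer = optim.SGD(model.parameters(), lr=1.0)

def lr_lambda(epoch):
    # epoch is 0-based: 0-9 -> 0.001, 10-89 -> 0.1, 90-119 -> 0.01, 120+ -> 0.001
    if epoch < 10:
        return 0.001
    elif epoch < 90:
        return 0.1
    elif epoch < 120:
        return 0.01
    return 0.001

scheduler = optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=lr_lambda)

for epoch in range(200):
    # ... run the training loop for one epoch (optimizer.step() per batch) ...
    scheduler.step()  # advance the schedule once per epoch

Printing optimizer.param_groups[0]['lr'] inside the loop is an easy way to check that the schedule does what you expect.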

or alternatively, you could write your own learning rate scheduler by inheriting from torch.optim.lr_scheduler._LRScheduler.
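Something along these lines, a rough sketch rather than a tested implementation; the class name PiecewiseConstantLR and its arguments are made up for illustration:

from torch.optim.lr_scheduler import _LRScheduler

class PiecewiseConstantLR(_LRScheduler):
    """Hypothetical scheduler that assigns a fixed lr to each milestone interval."""

    def __init__(self, optimizer, milestones, lr_values, last_epoch=-1):
        # milestones: epochs at which the lr changes, e.g. [10, 90, 120]
        # lr_values: one lr per interval, e.g. [0.001, 0.1, 0.01, 0.001]
        assert len(lr_values) == len(milestones) + 1
        self.milestones = milestones
        self.lr_values = lr_values
        super().__init__(optimizer, last_epoch)

    def get_lr(self):
        # count how many milestones the current epoch has already passed
        idx = sum(self.last_epoch >= m for m in self.milestones)
        return [self.lr_values[idx] for _ in self.optimizer.param_groups]

Used as scheduler = PiecewiseConstantLR(optimizer, milestones=[10, 90, 120], lr_values=[0.001, 0.1, 0.01, 0.001]) with scheduler.step() called once per epoch, this should reproduce the schedule from the question.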