I want to train my network like so:
epochs 1-10: learning rate = 0.001
epochs 10-90: learning rate = 0.1
epochs 90-120: learning rate = 0.01
epochs 120-200: learning rate = 0.001
I need the small learning rate at the beginning as a warmup, to kickstart the network into training. I tried using LambdaLR and defined factors such as
lambda1 = lambda epoch: 100.0  # factor to go from the base lr of 0.001 up to 0.1
lambda2 = lambda epoch: 0.1    # factor to decay the lr by 10x
but LambdaLR gives me no way to specify the milestone epochs at which one factor should take over from the other. MultiStepLR, on the other hand, accepts milestones but only a single gamma, and my schedule needs different factors at different milestones (x100 at epoch 10, then x0.1 at epochs 90 and 120). Is there a way around this?
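For what it's worth, the closest I have come is a hand-rolled piecewise lambda along these lines (the Linear model and SGD optimizer are placeholders just to make the snippet self-contained, and epochs are 0-indexed here):

import torch
from torch.optim.lr_scheduler import LambdaLR

model = torch.nn.Linear(10, 2)  # placeholder model, just for illustration
optimizer = torch.optim.SGD(model.parameters(), lr=0.001)  # base lr = 0.001

def lr_factor(epoch):
    # multiplier applied to the base lr of 0.001
    if epoch < 10:
        return 1.0    # epochs 0-9:    lr = 0.001 (warmup)
    elif epoch < 90:
        return 100.0  # epochs 10-89:  lr = 0.1
    elif epoch < 120:
        return 10.0   # epochs 90-119: lr = 0.01
    else:
        return 1.0    # epochs 120+:   lr = 0.001

scheduler = LambdaLR(optimizer, lr_lambda=lr_factor)

for epoch in range(200):
    # ... run one epoch of training here ...
    optimizer.step()
    scheduler.step()

If I have understood LambdaLR correctly, this should reproduce the schedule above, but hard-coding the boundaries like this feels clumsy, so I would prefer a built-in scheduler if one exists.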