Decreasing the maximum learning rate after every restart

Hi guys! I was using a scheduler to gradually decrease the learning rate of my optimizer, specifically CosineAnnealingWarmRestarts(). It takes a maximum learning rate that we set and anneals the learning rate along a cosine curve until it hits a restart, where the learning rate is reset to the maximum and the cycle starts over. What I want to do is decrease this maximum learning rate after every restart. Is that possible in PyTorch?

You could try to manipulate the scheduler.base_lrs using this code:

import torch
import torch.nn as nn

model = nn.Linear(1, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=1.)

# restart every T_0 = 10 epochs
scheduler = torch.optim.lr_scheduler.CosineAnnealingWarmRestarts(
    optimizer, T_0=10)

for epoch in range(40):
    print('scheduler ', scheduler.get_last_lr())
    print('optimizer ', optimizer.param_groups[0]['lr'])
    if (epoch + 1) % 10 == 0:
        # halve the base ("maximum") learning rate right before each restart
        print('decrease')
        scheduler.base_lrs[0] = scheduler.base_lrs[0] * 0.5

    scheduler.step()

A quick test shows that the optimizer's learning rate for the default parameter group is also changed.
However, I haven't tested it in an actual training run, so you should double check that the training works as expected.
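
If your optimizer uses more than one parameter group, the same idea would apply to every entry of scheduler.base_lrs. A minimal sketch building on the snippet above (not tested):

# scale the restart ("maximum") learning rate of every parameter group
scheduler.base_lrs = [base_lr * 0.5 for base_lr in scheduler.base_lrs]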


I have tested it and it is working as expected. The only thing I had to add was an extra condition to make sure the learning rate doesn't go below the minimum LR (sketched below).
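
A minimal sketch of that kind of condition (min_lr is just an example name and value for the floor, adjust it to your setup):

min_lr = 1e-4  # example floor for the restart learning rate

for epoch in range(40):
    if (epoch + 1) % 10 == 0:
        # halve the restart LR, but never let it fall below min_lr
        scheduler.base_lrs[0] = max(scheduler.base_lrs[0] * 0.5, min_lr)
    scheduler.step()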

Thanks @ptrblck :smiling_face:
