How to implement torch.optim.lr_scheduler.CosineAnnealingLR?

Would you like to lower the learning rate to its minimum in each epoch and then restart from the base learning rate?
If so, you could try the following code:

import torch.nn as nn
import torch.optim as optim

model = nn.Linear(10, 2)
optimizer = optim.SGD(model.parameters(), lr=1.)
steps = 10
scheduler = optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=steps)

for epoch in range(5):
    for idx in range(steps):
        optimizer.step()  # in real training, run the forward/backward pass before this
        scheduler.step()  # call after optimizer.step() to avoid the ordering warning
        print(scheduler.get_last_lr())  # get_lr() is deprecated for inspection

    print('Reset scheduler')
    scheduler = optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=steps)
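
For reference, CosineAnnealingLR anneals the learning rate following eta_t = eta_min + (eta_max - eta_min) * (1 + cos(pi * T_cur / T_max)) / 2, so with lr=1. and T_max=10 the learning rate is swept from 1.0 toward eta_min (0 by default) over the 10 steps.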

Note that the inner steps loop stands in for your DataLoader loop, i.e. you would call scheduler.step() once per batch.
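
Alternatively, if your PyTorch version includes it, optim.lr_scheduler.CosineAnnealingWarmRestarts builds the restart behavior into the scheduler itself, so you don't need to recreate the scheduler each epoch. A minimal sketch using the same toy model and step count as above:

import torch.nn as nn
import torch.optim as optim

model = nn.Linear(10, 2)
optimizer = optim.SGD(model.parameters(), lr=1.)
steps = 10

# T_0 is the number of scheduler steps until the first restart
scheduler = optim.lr_scheduler.CosineAnnealingWarmRestarts(optimizer, T_0=steps)

for epoch in range(5):
    for idx in range(steps):
        optimizer.step()  # forward/backward would run before this in real training
        scheduler.step()  # restarts automatically every T_0 steps
        print(scheduler.get_last_lr())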
