Learning rate scheduling at each step

I’m not sure if there is a more flexible scheduler class for your use case, but you could skip the scheduler entirely and update the learning rate manually at each step via:

# Manually set the new learning rate for every parameter group of the optimizer
for param_group in self.optimizer.param_groups:
    param_group['lr'] = lr
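
For context, here is a minimal sketch of how that update might sit inside a training loop. The model, optimizer, `base_lr`, and the linear-warmup rule are all illustrative assumptions, not part of the original answer; the only real API used is `optimizer.param_groups`:

import torch

# Hypothetical setup for illustration: a tiny model and plain SGD.
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

base_lr = 0.1
warmup_steps = 100  # assumed schedule parameter for this sketch

for step in range(1000):
    # Compute the desired learning rate for this step
    # (here: linear warmup, then constant — any rule works).
    lr = base_lr * min(1.0, (step + 1) / warmup_steps)

    # Apply it to every parameter group before the optimizer update.
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr

    optimizer.zero_grad()
    loss = model(torch.randn(8, 10)).pow(2).mean()  # dummy loss
    loss.backward()
    optimizer.step()

Since you control the loop, the rule for computing `lr` can depend on anything you like (step count, validation loss, etc.), which is exactly the flexibility the built-in schedulers don't always offer.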