Learning rate scheduling at each step

Hi everyone,
I want to modify the learning rate at each step instead of at the end of each epoch.
In particular, at each step of my training I compute a generic measure x, and I want to modify the learning rate for the next step as a function of x.
To give a more explicit idea of the pipeline, given a constant C, what I want to do is something like:

for epoch in range(n_epochs):
    for data, label in train_loader:
        # training step
        # computation of x
        # modify the learning rate as a function of x (for example lr = C * (1 + x))

So far I've only used lr_schedule = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=lr_rule) to modify the learning rate.
Is there a more flexible way in PyTorch to update it at each step, with an arbitrary function of an arbitrary computed measure?
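For reference, this is roughly how I've been using LambdaLR; lr_rule below is just a placeholder for my actual rule, and the learning rate only changes once per epoch:

lr_rule = lambda epoch: 1.0 / (1 + epoch)  # example rule, not my real one
lr_schedule = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=lr_rule)

for epoch in range(n_epochs):
    for data, label in train_loader:
        # training step
        ...
    lr_schedule.step()  # the learning rate is only updated here, once per epoch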

I'm not sure there is a more flexible scheduler class for your use case, but you could skip the scheduler and update the learning rate manually via:

for param_group in optimizer.param_groups:
    param_group['lr'] = lr
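Putting it together with your loop, a minimal runnable sketch could look like the following. The toy model and data are only placeholders, and using the current loss as the measure x is just an assumption to make the example concrete:

import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# toy model and data just so the sketch runs; replace with your own
model = nn.Linear(10, 2)
criterion = nn.CrossEntropyLoss()
train_loader = DataLoader(
    TensorDataset(torch.randn(100, 10), torch.randint(0, 2, (100,))),
    batch_size=10)

C = 0.01        # the constant from the question
n_epochs = 3
optimizer = torch.optim.SGD(model.parameters(), lr=C)

for epoch in range(n_epochs):
    for data, label in train_loader:
        # standard training step
        optimizer.zero_grad()
        output = model(data)
        loss = criterion(output, label)
        loss.backward()
        optimizer.step()

        # compute the measure x; the current loss is used here as a stand-in
        x = loss.item()

        # manually set the learning rate for the next step
        lr = C * (1 + x)
        for param_group in optimizer.param_groups:
            param_group['lr'] = lr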