Change learning rate during training with custom values

Hi all,

I am wondering if there is a way to set the learning rate each epoch to a custom value.

For instance, in MatConvNet you can specify the learning rate as LR_SCHEDULE = np.logspace(-3, -5, 120) to have it decay from 0.001 to 0.00001 over 120 training epochs.

Is there something similar I can do in PyTorch?

My first idea is to define the following function and then update the optimizer's learning rate each epoch:

def scheduler(optimizer, lr):
    # set the learning rate of every parameter group to the given value
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr
    return optimizer

so that in the training loop I can do:

for epoch in range(EPOCHS):
    lr = LR_SCHEDULE[epoch]
    optimizer = scheduler(optimizer, lr)
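
For completeness, here is the full setup I have in mind as a rough sketch (the model and optimizer are just placeholders):

import numpy as np
import torch

EPOCHS = 120
LR_SCHEDULE = np.logspace(-3, -5, EPOCHS)   # 0.001 -> 0.00001 over 120 epochs

model = torch.nn.Linear(10, 2)                                # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=LR_SCHEDULE[0])

for epoch in range(EPOCHS):
    optimizer = scheduler(optimizer, LR_SCHEDULE[epoch])      # set this epoch's LR
    # ... train for one epoch: forward pass, loss.backward(), optimizer.step() per batch ...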

Could this work?

Thanks


torch.optim.lr_scheduler basically does the same update step as your scheduler code (plus some additional checks).
Have a look at the implemented lr_schedulers to avoid rewriting them.
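
For example, LambdaLR can reproduce an absolute schedule such as your logspace one. A minimal sketch, assuming a recent PyTorch (the Linear model is only a placeholder):

import numpy as np
import torch
from torch.optim.lr_scheduler import LambdaLR

LR_SCHEDULE = np.logspace(-3, -5, 120)

model = torch.nn.Linear(10, 2)   # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=LR_SCHEDULE[0])

def lr_lambda(epoch):
    # LambdaLR multiplies the initial lr by this factor, so dividing by
    # LR_SCHEDULE[0] yields the absolute values from the array; the min()
    # keeps the index valid if step() is called one extra time.
    idx = min(epoch, len(LR_SCHEDULE) - 1)
    return LR_SCHEDULE[idx] / LR_SCHEDULE[0]

scheduler = LambdaLR(optimizer, lr_lambda=lr_lambda)

for epoch in range(120):
    # ... train for one epoch, calling optimizer.step() per batch ...
    scheduler.step()   # advance to the next epoch's learning rate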


Thank you, I am aware of torch.optim.lr_scheduler; I was looking more for something I can customize if needed, instead of using the implemented versions.


torch.optim.lr_scheduler seems to adjust the LR in a "relative" fashion, seeing that its get_lr() method does not take any argument. However, in most of my cases I want to adjust it in an "absolute" fashion, setting the LR based on the current epoch and the current iteration (yes, I adjust the LR every iteration). For that, a hand-rolled helper like the one above works great for me.
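
To illustrate, a rough sketch of the per-iteration "absolute" adjustment (the model, optimizer, and iteration counts are just placeholders):

import numpy as np
import torch

EPOCHS, ITERS_PER_EPOCH = 120, 500                            # placeholder sizes
LR_SCHEDULE = np.logspace(-3, -5, EPOCHS * ITERS_PER_EPOCH)   # one LR per iteration

model = torch.nn.Linear(10, 2)                                # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=LR_SCHEDULE[0])

def set_lr(optimizer, lr):
    # assign an absolute learning rate to every parameter group
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr

for epoch in range(EPOCHS):
    for it in range(ITERS_PER_EPOCH):
        step = epoch * ITERS_PER_EPOCH + it                   # global iteration index
        set_lr(optimizer, LR_SCHEDULE[step])
        # ... forward pass, loss.backward(), optimizer.step() ...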