lr_scheduler explanation?

In the PyTorch transfer learning tutorial, the following line of code appears:

scheduler = lr_scheduler.StepLR(optimizer_ft, step_size=7, gamma=0.1)

and during the model training:


I can’t find the documentation for lr_scheduler (only the method itself). Could someone please help me understand what exactly these lines do and what, in general, lr_scheduler is intended to do?


The learning rate scheduler adjusts the learning rates stored in the optimizer.
In your example a step function is used: every 7 calls to scheduler.step(), the learning rate is multiplied by the factor gamma=0.1.
Assuming that your initial learning rate is 1e-1:

optimizer = optim.SGD(model.parameters(), lr=1e-1)

Now after scheduler.step() is called 7 times, the learning rate will be lowered to
lr = 1e-1 * gamma = 1e-2.

This schedule looks like a step function, hence the name.
There are also other functions, like ExponentialLR or CosineAnnealingLR.

You can find all schedulers in the torch.optim documentation.
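A minimal runnable sketch of this behavior (using a single dummy parameter as a stand-in for a full model, so the optimizer has something to manage):

```python
import torch
from torch import optim
from torch.optim import lr_scheduler

# Dummy parameter standing in for model.parameters()
param = torch.nn.Parameter(torch.zeros(1))
optimizer = optim.SGD([param], lr=1e-1)

# Multiply the learning rate by gamma=0.1 every 7 scheduler steps
scheduler = lr_scheduler.StepLR(optimizer, step_size=7, gamma=0.1)

lrs = []
for epoch in range(15):
    lrs.append(optimizer.param_groups[0]["lr"])
    optimizer.step()    # normally preceded by loss.backward()
    scheduler.step()    # advance the schedule once per epoch

# lrs[0] is 0.1, lrs[7] is ~0.01 (after 7 steps), lrs[14] is ~0.001
print(lrs)
```

Note that scheduler.step() should be called after optimizer.step(); calling it per epoch (as in the tutorial's training loop) advances the schedule once per epoch.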


Many thanks ptrblck! Your explanation was very helpful!