What is the validation loss mentioned in the lr_scheduler docs?

Hello!

I hope I am not asking something dumb, but I could not find a clear answer despite a bit of googling, so I'll try my luck here:

When reading the docs about learning rate scheduling in PyTorch, I see that the scheduler has to be provided with a validation error value. Unfortunately, it is not described anywhere what this corresponds to nor how to compute it, so I don't really know how to call scheduler.step() properly…

Can someone help me?

Thanks in advance!

Some schedulers, e.g. ReduceLROnPlateau, expect to track a metric in order to lower the learning rate.
Usually you would split your dataset into training, validation (and test) sets, and use the validation loss to trigger the learning rate reduction.
The idea behind it is that once your validation loss doesn't decrease anymore (or not significantly), lowering the learning rate might help decrease it further.
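A minimal sketch of what that looks like in practice (the model, optimizer, and the constant `val_loss` here are just placeholders to show the mechanics — in real code you would compute `val_loss` by running the model on your validation set):

```python
import torch
from torch.optim.lr_scheduler import ReduceLROnPlateau

# Placeholder model and optimizer
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Multiply the LR by `factor` once the tracked metric has not improved
# for more than `patience` epochs. mode="min" means "lower is better".
scheduler = ReduceLROnPlateau(optimizer, mode="min", factor=0.1, patience=2)

for epoch in range(10):
    # ... training loop over the training set would go here ...

    # Normally: evaluate the model on the held-out validation set.
    # Here we fake a plateauing loss so the scheduler visibly kicks in.
    val_loss = 1.0
    scheduler.step(val_loss)  # the metric is passed to step()

print(optimizer.param_groups[0]["lr"])  # lower than the initial 0.1
```

Note that for this scheduler `scheduler.step(metric)` is called once per epoch, after validation, not after every batch.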

However, you could track whichever metric you like (e.g. validation accuracy with mode="max"). :wink:

PS: other learning rate schedulers do not expect any input and follow a fixed schedule instead, e.g. using epochs as milestones.
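For comparison, a sketch with MultiStepLR, one of the milestone-based schedulers: here `step()` takes no metric at all, and the LR drops at the specified epoch counts (the model, milestones, and gamma below are arbitrary example values):

```python
import torch
from torch.optim.lr_scheduler import MultiStepLR

# Placeholder model and optimizer
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Multiply the LR by gamma when the epoch counter reaches 3 and again at 6;
# no validation metric is involved.
scheduler = MultiStepLR(optimizer, milestones=[3, 6], gamma=0.1)

for epoch in range(8):
    # ... training loop would go here ...
    scheduler.step()  # called without any argument

print(optimizer.param_groups[0]["lr"])  # 0.1 -> 0.01 -> 0.001
```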