Detect when scheduler changes the learning rate

Hi,

I need to divide my learning rate and change my loss when my model reaches a plateau.
I’m using ReduceLROnPlateau for the learning rate part, and I would like to change the loss when the learning rate gets updated.
Is there any way to know from inside the script when the learning rate is being updated?

Hi,

You can at least check the current learning rate with

optimizer.param_groups[0]['lr']

but I don’t know if there is a more convenient way, such as a callback function.
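For example, you could poll that value once per epoch and compare it against the previous value. A rough sketch (the model, optimizer and the constant val_loss below are just placeholders for your own training loop):

import torch
from torch.optim.lr_scheduler import ReduceLROnPlateau

# Dummy model/optimizer, just for illustration
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = ReduceLROnPlateau(optimizer, factor=0.5, patience=3)

last_lr = optimizer.param_groups[0]["lr"]
for epoch in range(20):
    val_loss = 1.0  # placeholder: replace with your real validation loss
    scheduler.step(val_loss)

    current_lr = optimizer.param_groups[0]["lr"]
    if current_lr != last_lr:
        print(f"epoch {epoch}: lr reduced from {last_lr} to {current_lr}")
        # this is the place to switch to the other loss
        last_lr = current_lr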

Hi,

Thank you for your answer.
I have switched from ReduceLROnPlateau to editing the learning rate by doing:

for g in optimizer.param_groups:
    g["lr"] = new_learning_rate

and detecting a plateau in the validation loss myself, but I’m missing some features of ReduceLROnPlateau such as threshold and threshold_mode.
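Roughly, my manual version looks like the sketch below, with a relative threshold bolted on by hand (loosely mimicking threshold_mode='rel'; the class and variable names here are just mine, not a PyTorch API):

class SimplePlateauReducer:
    """Rough, hand-rolled stand-in for ReduceLROnPlateau (sketch only)."""

    def __init__(self, optimizer, factor=0.1, patience=5, threshold=1e-4):
        self.optimizer = optimizer
        self.factor = factor
        self.patience = patience
        self.threshold = threshold
        self.best = float("inf")
        self.num_bad_epochs = 0

    def step(self, val_loss):
        # Relative improvement check, similar to threshold_mode='rel'
        if val_loss < self.best * (1 - self.threshold):
            self.best = val_loss
            self.num_bad_epochs = 0
        else:
            self.num_bad_epochs += 1

        if self.num_bad_epochs > self.patience:
            for g in self.optimizer.param_groups:
                g["lr"] *= self.factor
            self.num_bad_epochs = 0
            return True   # lr was just reduced -> change the loss here
        return False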

Maybe the easiest option would be to copy the code of ReduceLROnPlateau and call the function that changes my loss at the end of its _reduce_lr method.
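Instead of copying the whole file, a subclass that overrides _reduce_lr might be enough. An untested sketch (note that _reduce_lr is a private method, so this relies on an implementation detail that may change between PyTorch versions, and the on_reduce callback argument is my own addition):

from torch.optim.lr_scheduler import ReduceLROnPlateau

class ReduceLROnPlateauWithCallback(ReduceLROnPlateau):
    """ReduceLROnPlateau that fires a callback whenever it actually reduces the lr."""

    def __init__(self, optimizer, on_reduce=None, **kwargs):
        super().__init__(optimizer, **kwargs)
        self.on_reduce = on_reduce  # e.g. a function that swaps the loss

    def _reduce_lr(self, epoch):
        super()._reduce_lr(epoch)   # do the usual lr reduction
        if self.on_reduce is not None:
            self.on_reduce()

Then something like scheduler = ReduceLROnPlateauWithCallback(optimizer, patience=5, on_reduce=switch_loss) would run the callback exactly when the learning rate is reduced.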

Looking into it, there isn’t any simple way to access the last learning rate, because ReduceLROnPlateau inherits from object, unlike the other schedulers, which inherit from _LRScheduler.
Is there a reason behind this?