I need to divide my learning rate and change my loss when my model reaches a plateau.
I’m using ReduceLROnPlateau for the learning rate part, and I would like to change the loss when the learning rate gets updated.
Is there any way to know from inside the script when the learning rate is being updated?
You can at least check the learning rate with `optimizer.param_groups[0]["lr"]`, but I don’t know if there is a more efficient way, such as a callback function.
Thank you for your answer.
I have switched from ReduceLROnPlateau to editing the learning rate directly:

for g in optimizer.param_groups:
    g["lr"] = new_learning_rate
and detecting a plateau in the val loss myself, but I’m missing some of ReduceLROnPlateau’s features, such as its patience and cooldown handling.
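Those features can be re-implemented by hand. A rough sketch of a stand-alone plateau detector, loosely mirroring ReduceLROnPlateau’s patience/cooldown bookkeeping (the class name and defaults are my own):

```python
class PlateauDetector:
    """Tracks a metric and reports when it has stopped improving.

    Loosely mirrors ReduceLROnPlateau's patience/cooldown logic, but only
    detects the plateau; the lr update (and loss switch) is left to the caller.
    """

    def __init__(self, patience=5, cooldown=0, threshold=1e-4):
        self.patience = patience      # bad epochs tolerated before firing
        self.cooldown = cooldown      # epochs ignored after firing
        self.threshold = threshold    # relative improvement required
        self.best = float("inf")
        self.num_bad = 0
        self.cooldown_counter = 0

    def step(self, metric):
        """Record one epoch's metric; return True if a plateau was just hit."""
        if metric < self.best * (1 - self.threshold):
            self.best = metric
            self.num_bad = 0
        elif self.cooldown_counter > 0:
            self.cooldown_counter -= 1
        else:
            self.num_bad += 1
            if self.num_bad > self.patience:
                self.num_bad = 0
                self.cooldown_counter = self.cooldown
                return True
        return False
```

Usage is then just `if detector.step(val_loss): ...` inside the training loop, updating both the learning rate and the loss in one place.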
Maybe the easiest would be to copy the code for ReduceLROnPlateau and call the function changing my loss at the end of their `_reduce_lr` method.
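Rather than copying the whole class, subclassing and overriding `_reduce_lr` may be enough. Note that `_reduce_lr` is a private implementation detail and could change between PyTorch versions; the `on_reduce` callback name is my own:

```python
import torch
from torch.optim.lr_scheduler import ReduceLROnPlateau

class ReduceLROnPlateauWithCallback(ReduceLROnPlateau):
    """ReduceLROnPlateau that invokes a callback whenever it cuts the lr."""

    def __init__(self, optimizer, on_reduce=None, **kwargs):
        super().__init__(optimizer, **kwargs)
        self.on_reduce = on_reduce  # e.g. a function that swaps the loss

    def _reduce_lr(self, epoch):
        # Private hook in the parent class, called exactly when the lr drops.
        super()._reduce_lr(epoch)
        if self.on_reduce is not None:
            self.on_reduce(epoch)
```

Then something like `ReduceLROnPlateauWithCallback(optimizer, on_reduce=lambda e: switch_loss(), patience=5)` (where `switch_loss` is whatever hypothetical function changes the loss) keeps all of the scheduler’s built-in features.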
Looking into it, there isn’t any simple way to access the last learning rate, because ReduceLROnPlateau inherits from `object`, contrary to the other schedulers, which inherit from `_LRScheduler`.
Is there a reason behind this?