I need to reduce my learning rate and switch my loss function when my model reaches a plateau.
I'm using ReduceLROnPlateau for the learning-rate part, and I'd like to change the loss whenever the learning rate gets updated.
Is there any way to know from inside the script when the learning rate is being updated?
Looking into it, there doesn't seem to be a simple way to access the last learning rate: ReduceLROnPlateau inherits directly from object, unlike the other schedulers, which inherit from _LRScheduler.
Is there a reason behind this?
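In the meantime, one workaround is to read the learning rate directly from the optimizer's param_groups before and after each scheduler.step() call: ReduceLROnPlateau mutates those groups in place, so a change between the two reads signals a reduction. A minimal sketch of this idea (the use_alternate_loss flag and the constant validation losses are just illustrative assumptions, not part of any API):

```python
import torch
from torch import nn, optim
from torch.optim.lr_scheduler import ReduceLROnPlateau

model = nn.Linear(10, 1)
optimizer = optim.SGD(model.parameters(), lr=0.1)
scheduler = ReduceLROnPlateau(optimizer, mode='min', factor=0.1, patience=2)

def current_lrs(optimizer):
    # The scheduler updates the optimizer's param groups in place,
    # so the current learning rate is always readable from there.
    return [group['lr'] for group in optimizer.param_groups]

use_alternate_loss = False  # hypothetical flag for switching the loss
for epoch in range(6):
    val_loss = 1.0  # stand-in for a real validation loss; constant = plateau
    before = current_lrs(optimizer)
    scheduler.step(val_loss)
    after = current_lrs(optimizer)
    if after != before:
        # The plateau triggered a reduction: react here.
        use_alternate_loss = True
```

With the default relative threshold, the constant loss counts as a plateau, so after patience epochs the learning rate drops and the comparison detects it.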