ReduceLROnPlateau is not decreasing the learning rate

Hi,
In my code, I want to reduce the learning rate when my monitored metric (loss_mean) reaches a minimum and stops improving, but ReduceLROnPlateau is not decreasing the learning rate; it stays constant throughout training.
The code below shows how I set up ReduceLROnPlateau. The values of my monitored metric, loss_mean, are:
-193.92658996582
-193.953338623047
-193.979461669922
-193.974502563477
-193.963928222656
-193.976669311523
-193.98210144043
-193.942535400391
-193.969467163086
-193.977722167969
-193.976165771484
-193.969940185547
-193.953491210938
-193.986679077148
-193.95051574707
-193.958099365234
-193.962783813477
-193.970840454102
-193.988967895508
-193.992889404297
(the last few epochs)

scheduler = {
    "scheduler": ReduceLROnPlateau(
        optimizer=optimizer, mode="min", factor=0.92,
        patience=50, threshold=1e-2, min_lr=1e-6),
    "monitor": "loss_mean"}

Please have a look and help me.

The patience value is set to 50, which seems quite high considering:

patience (int) – Number of epochs with no improvement after which learning rate will be reduced. For example, if patience = 2, then we will ignore the first 2 epochs with no improvement, and will only decrease the LR after the 3rd epoch if the loss still hasn’t improved then. Default: 10.
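
To see what that means in practice, here is a minimal sketch with a dummy optimizer and made-up loss values (not your actual setup), just to show when the LR actually drops:

import torch
from torch.optim.lr_scheduler import ReduceLROnPlateau

# Dummy parameter and optimizer, only used to drive the scheduler.
param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.SGD([param], lr=1e-3)
scheduler = ReduceLROnPlateau(optimizer, mode="min", factor=0.92,
                              patience=2, threshold=1e-2, min_lr=1e-6)

# A loss that improves twice and then plateaus.
for epoch, loss in enumerate([1.0, 0.5, 0.5, 0.5, 0.5, 0.5]):
    optimizer.step()       # normally preceded by a backward pass
    scheduler.step(loss)   # ReduceLROnPlateau needs the monitored value here
    print(epoch, optimizer.param_groups[0]["lr"])

# With patience=2 the LR is only reduced after the 3rd epoch without improvement;
# with patience=50 you would need 51 such epochs before anything changes.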

By default, threshold_mode also uses a relative comparison ("rel"), and from the few values you have posted the loss counts as improved again after a few steps, so you might want to adapt these settings a bit.
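
For reference, a rough sketch of how I understand the improvement check for mode="min" with the default threshold_mode="rel" (simplified, ignoring cooldown, eps and min_lr), applied to the first few values you posted:

# Simplified version of the "has the metric improved?" check
# (my reading of the docs for mode="min", threshold_mode="rel").
def improved(new, best, threshold=1e-2):
    return new < best * (1.0 - threshold)

losses = [-193.9266, -193.9533, -193.9795, -193.9745, -193.9639]  # first few posted values
best = float("inf")
bad_epochs = 0
for loss in losses:
    if improved(loss, best):
        best = loss
        bad_epochs = 0      # counter resets, patience starts counting again
    else:
        bad_epochs += 1     # LR is reduced once this exceeds patience
    print(f"loss={loss:.4f}  best={best:.4f}  bad_epochs={bad_epochs}")

Whenever that check passes, the bad-epoch counter resets, so with a large patience the LR can stay constant for a very long time.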

Hi,
I reduced the patience to 5 and the learning rate started decreasing. Can you tell me what threshold and threshold_mode do, with an example (using the monitored metric I mentioned)? I have read the docs and all the available material but could not understand it. Please explain these simply if possible.