ReduceLROnPlateau is not decreasing the learning rate

In my code, I want to reduce the learning rate when my monitored metric (loss_mean) plateaus at its minimum, but ReduceLROnPlateau is not decreasing the learning rate; it stays constant throughout training.
The code below shows how I set up ReduceLROnPlateau. The values of my monitored tensor, loss_mean, are:
(the last few epochs)

"scheduler": ReduceLROnPlateau(
    optimizer=optimizer, mode="min", factor=0.92, patience=50, threshold=1e-2, min_lr=1e-6),
"monitor": "loss_mean"}
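For context, here is a minimal self-contained sketch of how this scheduler is wired into a plain PyTorch loop; the tiny model, optimizer, and placeholder loss value are my own illustrative assumptions, not taken from the actual training code. The key point is that the monitored metric must be passed to scheduler.step():

```python
import torch
from torch.optim.lr_scheduler import ReduceLROnPlateau

# Hypothetical tiny model and optimizer, just to make the snippet runnable.
model = torch.nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

scheduler = ReduceLROnPlateau(
    optimizer, mode="min", factor=0.92, patience=50,
    threshold=1e-2, min_lr=1e-6)

for epoch in range(3):
    loss_mean = 1.0  # placeholder for the real epoch-mean loss
    # The scheduler only reacts to whatever value is passed here.
    scheduler.step(loss_mean)
```

With patience=50, three epochs of a flat loss are nowhere near enough to trigger a reduction, so the learning rate stays at its initial value here.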

Please have a look and help me.

The patience value is set to 50 steps, which seems high considering:

patience (int) – Number of epochs with no improvement after which learning rate will be reduced. For example, if patience = 2, then we will ignore the first 2 epochs with no improvement, and will only decrease the LR after the 3rd epoch if the loss still hasn’t improved then. Default: 10.
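The quoted behavior can be checked with a small standalone sketch (the dummy parameter and constant losses below are illustrative, not from the original code). With patience=2 and a loss that never improves, the LR is reduced only on the 3rd consecutive non-improving epoch after the first measurement:

```python
import torch
from torch.optim.lr_scheduler import ReduceLROnPlateau

# A dummy parameter so the optimizer has something to hold.
param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.SGD([param], lr=0.1)
scheduler = ReduceLROnPlateau(optimizer, mode="min", factor=0.5, patience=2)

lrs = []
for loss in [1.0, 1.0, 1.0, 1.0]:  # the loss never improves
    scheduler.step(loss)
    lrs.append(optimizer.param_groups[0]["lr"])

# The first step sets the initial best; epochs 2 and 3 are the two
# tolerated "bad" epochs; the 4th finally triggers the reduction.
print(lrs)  # [0.1, 0.1, 0.1, 0.05]
```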

By default threshold_mode will also use a relative comparison and from the few values you have posted the loss decreases after a few steps again, so you might want to adapt the settings a bit.
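To make the relative comparison concrete: with mode="min" and the default threshold_mode="rel", a new loss only counts as an improvement if it falls below best * (1 - threshold); with threshold_mode="abs" the bar is best - threshold instead. The loss values below are made up for illustration, but the threshold=1e-2 matches the setting in the question:

```python
best = 0.500       # best loss seen so far (illustrative value)
threshold = 1e-2   # same threshold as in the question

# "rel" mode (default): must beat best by more than 1% of best.
rel_bar = best * (1 - threshold)   # 0.495
print(0.498 < rel_bar)  # False: only 0.4% below best, not an improvement
print(0.490 < rel_bar)  # True: 2% below best, counts as an improvement

# "abs" mode: must beat best by more than the fixed amount threshold.
abs_bar = best - threshold         # 0.490
print(0.492 < abs_bar)  # False: improved by only 0.008 < 0.01
```

So with a loss around 0.5 and threshold=1e-2 in "rel" mode, any epoch that improves the loss by less than roughly 0.005 still counts as a "bad" epoch toward the patience counter.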

I reduced the patience to 5 and the learning rate started decreasing. Can you tell me what threshold and threshold_mode do, with an example (using the monitored metric that I mentioned)? I have read the docs and all the available materials but could not understand it. Please explain these to me simply if possible.