Set New LR dynamically

I want to set the LR during the training cycle based on the validation loss, as a hybrid of OneCycle scheduling and a fixed LR.
Whenever I see no improvement in the validation metric beyond a certain LR, I want to stop stepping the scheduler and just set a preferred LR at that point.

Let me know how we can achieve this.

Currently I am doing it like this:

if scheduler.get_last_lr()[0] > 3e-3 or epoch < int(0.35 * 48):
    scheduler.step()
else:
    print('Not taking scheduler step at LR', scheduler.get_last_lr()[0])

The problem with this method is that if the LR is only slightly above 3e-3, taking one more step drops it to an LR at which the validation metric no longer improves.
So I just want to set the LR back to the last value at which the metric was still improving.
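To make the idea concrete, here is a framework-free sketch of what I mean, assuming the scheduler can be represented as a precomputed list of LRs and the optimizer as a dict with param_groups (placeholders, not a real PyTorch API): follow the schedule while the validation loss improves, and once it stalls for `patience` epochs, stop stepping and pin the LR to the value it had at the last improvement.

```python
def run(schedule, val_losses, patience=1):
    # `schedule` stands in for the OneCycle LR curve (one LR per epoch);
    # `optimizer` mimics PyTorch's optimizer.param_groups structure.
    optimizer = {"param_groups": [{"lr": schedule[0]}]}
    best = float("inf")        # best validation loss seen so far
    best_lr = schedule[0]      # LR at the last epoch where the metric improved
    bad = 0                    # consecutive epochs without improvement
    frozen = False             # once True, stop following the schedule
    history = []
    for epoch, loss in enumerate(val_losses):
        lr = optimizer["param_groups"][0]["lr"]
        history.append(lr)     # LR used for this epoch
        if loss < best:        # validation metric improved
            best, best_lr, bad = loss, lr, 0
        else:
            bad += 1
            if bad >= patience:
                frozen = True  # stop stepping the schedule from now on
        if frozen:
            new_lr = best_lr   # pin to the last LR that still improved
        else:
            new_lr = schedule[min(epoch + 1, len(schedule) - 1)]
        for g in optimizer["param_groups"]:
            g["lr"] = new_lr   # equivalent of scheduler.step() / manual set
    return history
```

For example, with a decaying schedule [1e-2, 6e-3, 3e-3, 1e-3, 5e-4] and validation losses [1.0, 0.8, 0.9, 0.95, 0.96], the loss stops improving at the epoch run with LR 3e-3, so training snaps back to 6e-3 (the last improving LR) and stays there. In real code the same effect could be had by skipping scheduler.step() and writing the saved LR into each optimizer.param_groups entry.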