Setting a lower bound on the learning rate in torch.optim.lr_scheduler.StepLR

Hi!

I was trying to find out whether I can set a lower bound on the learning rate in torch.optim.lr_scheduler.StepLR so that it will not decay any further on step().

I was wondering if I could do that by passing an argument, but I didn't find one for it in the docs. Would I have to do it manually by checking params['lr']?
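
In case it helps to frame the question, this is roughly what I mean by doing it manually: clamping each param group's lr right after step(). The model, optimizer settings, and min_lr value here are just placeholders for illustration:

```python
import torch
from torch import nn, optim

model = nn.Linear(10, 2)
optimizer = optim.SGD(model.parameters(), lr=0.1)
scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)

min_lr = 1e-4  # placeholder floor value

for epoch in range(100):
    # ... training loop ...
    scheduler.step()
    # manually clamp each param group's lr to the floor after the scheduler update
    for group in optimizer.param_groups:
        group['lr'] = max(group['lr'], min_lr)
```

Is there a built-in way to achieve this instead, or is a workaround like the above the expected approach?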

Many thanks.