Using Learning Rate scheduler with Trainer


I’m currently using Trainer for training models, but I would like to control the learning rate over the whole training run. Can I use an lr_scheduler with Trainer? I can’t find any information on how to use the two together.

Could you give more information about the Trainer and where this class comes from?
Assuming you are using a higher-level API, I would expect its documentation to describe how to plug in a learning rate scheduler. If you’ve written the Trainer yourself, could you post its code, please?
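In the meantime, here is a minimal sketch of what a typical PyTorch-style scheduler does, independent of any Trainer class. It reproduces the step-decay behavior of `torch.optim.lr_scheduler.StepLR` in plain Python (the `step_size` and `gamma` values are illustrative, not defaults from your setup); in an actual PyTorch training loop you would instead construct a `StepLR` around your optimizer and call `scheduler.step()` once per epoch.

```python
def step_decay_lr(base_lr, epoch, step_size=30, gamma=0.1):
    """Return the learning rate for a given epoch.

    The base rate is multiplied by `gamma` once every
    `step_size` epochs, mirroring torch.optim.lr_scheduler.StepLR.
    """
    return base_lr * (gamma ** (epoch // step_size))

# With a base rate of 0.1, the rate drops by 10x at epochs 30 and 60.
for epoch in (0, 29, 30, 59, 60, 89):
    print(epoch, step_decay_lr(0.1, epoch))
```

If your Trainer exposes the optimizer, attaching a real `StepLR` to it and stepping it at the end of each epoch is usually all that is needed; whether the Trainer calls `scheduler.step()` for you depends on which library it comes from.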