For my neural network, I am trying to vary the learning rate using two different approaches - lr_scheduler.StepLR and ReduceLROnPlateau. I have tried multiple values for step_size and gamma for StepLR, and for factor and patience for ReduceLROnPlateau, but I am not getting better results than with a constant learning rate.
scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=100, gamma=0.1)
scheduler = ReduceLROnPlateau(optimizer, mode='min', factor=0.05, patience=20, verbose=True)
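For context, here is a minimal runnable sketch (toy linear model and random data, both assumptions of mine) of how each scheduler is stepped inside the training loop. Note that the two step calls differ: ReduceLROnPlateau.step expects the monitored metric, while StepLR.step takes no argument.

```python
import torch
import torch.nn as nn
import torch.optim as optim
from torch.optim.lr_scheduler import ReduceLROnPlateau

# Toy model and data, just to make the loop runnable.
model = nn.Linear(4, 1)
optimizer = optim.SGD(model.parameters(), lr=0.1)
scheduler = ReduceLROnPlateau(optimizer, mode='min', factor=0.05, patience=20)

x = torch.randn(32, 4)
y = torch.randn(32, 1)
loss_fn = nn.MSELoss()

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    # ReduceLROnPlateau monitors a metric, so the loss is passed in.
    # A StepLR scheduler would instead be stepped as: scheduler.step()
    scheduler.step(loss.item())

print(optimizer.param_groups[0]['lr'])
```

With patience=20 and only a few epochs here, no reduction is triggered yet, so the printed learning rate is still the initial 0.1.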
But the common practice is to schedule the learning rate, as it is believed to give better results. So I wanted to ask: is there a common best practice for choosing these values based on the number of epochs, the initial learning rate, or something else?