Hi everyone! I'm new to PyTorch and was wondering how a scheduler works together with an optimizer. To my understanding, the scheduler changes the LR of the optimizer, right? Say an SGD optimizer has a learning rate of 0.01 and I have a CyclicLR scheduler with base_lr=1e-6 and max_lr=0.006. How does the learning rate of the optimizer change over time? I'm confused about what happens when the optimizer's LR is not inside the scheduler's base/max LR range.
Or must the LR of the optimizer already be within the range of the CyclicLR's base_lr and max_lr? Thank you in advance!
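For reference, here is a minimal sketch of the setup I mean (the single dummy parameter is just a placeholder for a real model, and the momentum value is arbitrary):

```python
import torch

# toy parameter to optimize (placeholder for a real model)
params = [torch.nn.Parameter(torch.zeros(1))]

# optimizer constructed with lr=0.01, which lies outside the
# scheduler's [base_lr, max_lr] range
optimizer = torch.optim.SGD(params, lr=0.01, momentum=0.9)

scheduler = torch.optim.lr_scheduler.CyclicLR(
    optimizer, base_lr=1e-6, max_lr=0.006)

# record the LR the optimizer actually uses for the first few steps
lrs = []
for step in range(5):
    lrs.append(optimizer.param_groups[0]["lr"])
    optimizer.step()
    scheduler.step()
print(lrs)
```

Printing `optimizer.param_groups[0]["lr"]` like this shows which learning rate the optimizer is actually using at each step, which is what I'm trying to understand.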