CyclicLR scheduler

hi guys! I’m new to PyTorch and was wondering how a scheduler works with an optimizer. To my understanding, the scheduler changes the LR of the optimizer, right? Say an SGD optimizer has a learning rate of 0.01 and I have a CyclicLR scheduler with a base_lr of 1e-6 and a max_lr of 0.006. How does the learning rate of the optimizer change? I’m confused about what happens when the LR of the optimizer is not in the range of the scheduler’s base/max LR.

Or must the LR of the optimizer be in the range of the base_lr and max_lr of the CyclicLR? Thank you in advance.

The scheduler manipulates and thus defines the learning rate of the passed optimizer, so the lr you pass to the optimizer does not need to be inside the scheduler's range. In fact, CyclicLR overwrites each param group's lr with base_lr as soon as it is constructed, so the initial value is simply discarded, as seen here:

import torch.nn as nn
import torch.optim as optim

model = nn.Linear(3*224*224, 10)
optimizer = optim.SGD(model.parameters(), lr=1.)
print(optimizer.param_groups[0]['lr'])  # 1.0 - the lr passed to SGD

# creating the scheduler immediately overwrites the optimizer's lr with base_lr
scheduler = optim.lr_scheduler.CyclicLR(
    optimizer, base_lr=1e-3, max_lr=1e-2, step_size_up=5)

for epoch in range(10):
    # both values match, since the scheduler writes directly
    # into optimizer.param_groups
    print('Epoch {}, optim.lr {}, scheduler.lr {}'.format(
        epoch, optimizer.param_groups[0]['lr'], scheduler.get_last_lr()))
    optimizer.step()    # step the optimizer first ...
    scheduler.step()    # ... then the scheduler
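
If you want to see where those numbers come from, here is a quick hand-computed sketch of the "triangular" policy from Leslie Smith's CLR paper, which is CyclicLR's default mode (the variable names here are my own, just for illustration):

import math

# hand-computed triangular schedule, following the CLR paper's formula
base_lr, max_lr, step_size = 1e-3, 1e-2, 5

for it in range(20):
    cycle = math.floor(1 + it / (2 * step_size))  # which cycle we are in
    x = abs(it / step_size - 2 * cycle + 1)       # position within the cycle
    lr = base_lr + (max_lr - base_lr) * max(0.0, 1 - x)
    print(f'iteration {it}: lr {lr:.4f}')

The lr ramps linearly from base_lr up to max_lr over step_size_up iterations and then back down again; at no point does it depend on the lr=1. originally passed to SGD.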