CyclicLR always between 1e-4 and 1e-2

I am trying to use CyclicLR with my model. I changed the optimizer from Adam to SGD due to (if I understood correctly) a bug in the last stable version of PyTorch.
The problem now is that no matter which base_lr and max_lr I pass when initializing the scheduler, the learning rate fluctuates between 1e-4 and 1e-2.

Could you post a code snippet showing how you are initializing the optimizer and lr_scheduler so that we can have a look?

opti = torch.optim.SGD(self.model_.parameters(), lr=lr, momentum=0.9)
scheduler = CyclicLR(opti, base_lr=lr, max_lr=0.01, step_size_up=10)

With lr = 0.0001.
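Note that with base_lr=0.0001 and max_lr=0.01, cycling between 1e-4 and 1e-2 is exactly the expected behavior of CyclicLR. A minimal runnable sketch (using a toy linear model as a stand-in, since your model isn't shown) that logs the learning rate per step makes this easy to verify:

```python
import torch
from torch.optim.lr_scheduler import CyclicLR

model = torch.nn.Linear(4, 2)  # stand-in model (assumption)
lr = 0.0001
opti = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
scheduler = CyclicLR(opti, base_lr=lr, max_lr=0.01, step_size_up=10)

lrs = []
for step in range(40):
    opti.step()        # dummy step (no loss/backward needed to inspect the schedule)
    scheduler.step()   # advance the cyclic schedule by one step
    lrs.append(opti.param_groups[0]["lr"])

# the schedule sweeps the full [base_lr, max_lr] range each cycle
print(min(lrs), max(lrs))
```

If you want a narrower range, pass different base_lr/max_lr values to CyclicLR; the scheduler will honor them regardless of the lr set on the optimizer.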