Hello everyone!

I have a question about `torch.optim.lr_scheduler.CyclicLR` and its `step_size_up` parameter. During training I set only `step_size_up` and did not set `step_size_down`, yet I observed that the learning rate both increases and decreases over the course of training.
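Here is a minimal sketch of the kind of setup I mean (not my actual training code; the model, optimizer, and hyperparameter values are placeholders), where only `step_size_up` is passed:

```python
import torch

# Toy model and optimizer, just to drive the scheduler.
model = torch.nn.Linear(1, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

# Only step_size_up is set; step_size_down is left at its default.
scheduler = torch.optim.lr_scheduler.CyclicLR(
    optimizer, base_lr=0.001, max_lr=0.01, step_size_up=5
)

lrs = []
for _ in range(10):
    optimizer.step()
    scheduler.step()
    lrs.append(optimizer.param_groups[0]["lr"])

# The recorded learning rates rise for the first 5 steps and then fall,
# even though step_size_down was never specified.
print(lrs)
```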

My understanding is that if I set only `step_size_up`, the learning rate should not decrease at all.

Could you explain why the learning rate decreases?