How to use OneCycleLR properly

I have 288 images and I am using a batch size of 4, so I think my steps_per_epoch is 288 // 4 = 72.

I have plotted the learning rate as follows:

import torch
import matplotlib.pyplot as plt

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4, weight_decay=1e-4)
scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer, max_lr=1e-3, pct_start=0.33,
    steps_per_epoch=288 // 4, epochs=10, last_epoch=-1)

lrs = []
for i in range(10):
    optimizer.step()  # step the optimizer before the scheduler to avoid a warning
    lrs.append(optimizer.param_groups[0]["lr"])
    scheduler.step()
plt.plot(lrs)

I am getting the following:

[plot: the learning rate only increases over the 10 recorded steps]

However, this graph does not show the learning rate decreasing.

You would see the decreasing phase by iterating the scheduler for the full cycle: since you've specified steps_per_epoch=72 and epochs=10, the cycle spans 72 * 10 = 720 steps, while your loop only runs 10 of them.
From the docs:

epochs (int): The number of epochs to train for. This is used along with steps_per_epoch in order to infer the total number of steps in the cycle if a value for total_steps is not provided.
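For example, here is a minimal sketch that iterates through all 720 steps (using a stand-in nn.Linear model, since your model definition isn't shown):

import torch
import matplotlib.pyplot as plt

model = torch.nn.Linear(1, 1)  # hypothetical stand-in for your actual model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4, weight_decay=1e-4)

steps_per_epoch = 288 // 4  # 72 batches per epoch
epochs = 10
scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer, max_lr=1e-3, pct_start=0.33,
    steps_per_epoch=steps_per_epoch, epochs=epochs)

lrs = []
for _ in range(steps_per_epoch * epochs):  # all 720 steps of the cycle
    optimizer.step()
    lrs.append(optimizer.param_groups[0]["lr"])
    scheduler.step()

plt.plot(lrs)
plt.xlabel("step")
plt.ylabel("learning rate")
plt.show()

With pct_start=0.33, the learning rate ramps up for roughly the first 238 steps (0.33 * 720) and then anneals down for the remaining ~482, which is why a 10-step plot only shows the very beginning of the warmup.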