Scheduler.step() after each epoch or after each minibatch

Hi, I defined a StepLR scheduler like this:

scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=40, gamma=0.1)
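
If I understand correctly, StepLR keeps the learning rate constant for step_size calls to scheduler.step() and then multiplies it by gamma. Here is a minimal sketch to inspect the schedule (the dummy one-parameter model and the base LR of 0.01 are just assumptions for illustration):

import torch

param = torch.nn.Parameter(torch.zeros(1))  # dummy parameter so the optimizer has something to hold
optimizer = torch.optim.SGD([param], lr=0.01)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=40, gamma=0.1)

for epoch in range(120):
  # lr is 0.01 for epochs 0-39, 0.001 for 40-79, 0.0001 afterwards
  print(epoch, optimizer.param_groups[0]["lr"])
  optimizer.step()  # stands in for one full training epoch
  scheduler.step()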

But I was wondering: what is the best way to use it, after each epoch or after each minibatch?
USE CASE 1

for epoch in range(num_epoch):
  scheduler.step()  # steps at the start of every epoch
  for img, labels in train_loader:
    ...  # forward pass, compute loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

This one gives this warning:

UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`.
In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`.

USE CASE 2

for epoch in range(num_epoch):
  for img, labels in train_loader:
    ...  # forward pass, compute loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
  # At the end of the epoch
  scheduler.step()

This way there is no warning, since optimizer.step() is called before scheduler.step(), and scheduler.step() still runs only once per epoch.
USE CASE 3

for epoch in range(num_epoch):
  for img, labels in train_loader:
    ...  # forward pass, compute loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    scheduler.step()  # runs for each minibatch

Here, scheduler.step() runs for each minibatch.
What do you think is the right way of using such a scheduler?
Thank you.


Use case #2 is the best; that is the one recommended by the PyTorch docs. You could use #3 if you set step_size really high, but stepping every minibatch with a step_size meant for epochs will usually lower the learning rate much too fast.
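
To make "too fast" concrete, here is a rough sketch comparing where the LR ends up after one epoch under the two stepping frequencies (the 500 minibatches per epoch and the base LR of 0.01 are just assumed figures for illustration):

import torch

def final_lr(steps_per_epoch):
  # LR left in the optimizer after calling scheduler.step() steps_per_epoch times
  param = torch.nn.Parameter(torch.zeros(1))
  optimizer = torch.optim.SGD([param], lr=0.01)
  scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=40, gamma=0.1)
  for _ in range(steps_per_epoch):
    optimizer.step()
    scheduler.step()
  return optimizer.param_groups[0]["lr"]

print(final_lr(1))    # stepping once per epoch: LR is still 0.01 after one epoch
print(final_lr(500))  # stepping per minibatch: 0.01 * 0.1**12, i.e. about 1e-14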
