I call `scheduler.step()` right after `optimizer.step()`, but I still get a warning saying I might be calling them in the wrong order. Why?
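
For reference, my understanding of the pattern recommended at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate (the page the warning links to) is the sketch below, where `dataset` and `loss_fn` are placeholders:

for epoch in range(20):
    for input, target in dataset:
        optimizer.zero_grad()
        output = model(input)
        loss = loss_fn(output, target)
        loss.backward()
        optimizer.step()  # update the weights first...
    scheduler.step()      # ...then adjust the learning rate

As far as I can tell, my loop below follows the same order.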

Here’s the code snippet:

from torch.optim.lr_scheduler import CosineAnnealingLR
from torch.optim import SGD
from torchvision.models import resnet18
from copy import deepcopy
from torch import randn

model = resnet18()
optimizer = SGD(model.parameters(), 0.01)
lr_scheduler = CosineAnnealingLR(optimizer, 10)
# One scheduler per 10-iteration phase, cloned from the same template.
lr_schedulers = [deepcopy(lr_scheduler) for _ in range(5)]
current_scheduler = None

for i in range(60):
    if i % 10 == 0:
        # Switch to the next scheduler at the start of each phase.
        current_scheduler = lr_schedulers[i // 10]
    x = randn(10, 3, 32, 32)
    optimizer.zero_grad()  # clear gradients from the previous iteration
    loss = model(x).sum()
    loss.backward()
    optimizer.step()          # update the parameters first...
    current_scheduler.step()  # ...then step the scheduler

As can be seen in the snippet, `scheduler.step()` is called right after `optimizer.step()`, yet I still receive the following warning:

/root/miniconda3/envs/myconda/lib/python3.10/site-packages/torch/optim/lr_scheduler.py:129:
 UserWarning: Seems like `optimizer.step()` has been overridden after learning rate scheduler
 initialization. Please, make sure to call `optimizer.step()` before `lr_scheduler.step()`. 
See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
  warnings.warn("Seems like `optimizer.step()` has been overridden after learning rate scheduler "
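
Could the `deepcopy` be the culprit? As a quick sanity check (relying on the scheduler storing a reference to its optimizer in its `optimizer` attribute), the copied schedulers no longer reference the optimizer I actually call `step()` on:

print(lr_schedulers[0].optimizer is optimizer)  # False: deepcopy cloned the optimizer too

If that is what triggers the warning, how should I clone a scheduler without also cloning its optimizer?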