UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`

When using mixed precision, I am getting this warning; without mixed precision, it does not appear. Here is my training loop:

scaler = torch.cuda.amp.GradScaler()
with experiment.train():
    for batch_idx, _data in enumerate(train_loader):
        image, labels = _data
        image, labels = image.to(device), labels.to(device)
        optimizer.zero_grad()
        with torch.cuda.amp.autocast():
            output = model(image)
            loss = criterion(output, labels)
        scaler.scale(loss).backward()
        scaler.step(optimizer)
        scaler.update()
        scheduler.step()
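
My understanding is that scaler.step(optimizer) silently skips the underlying optimizer.step() on any iteration where inf/NaN gradients are found (the loss scale is then reduced by scaler.update()), so on those iterations scheduler.step() really does run before any optimizer step, which would explain why the warning only appears with mixed precision. Below is a minimal, self-contained sketch of one possible workaround, comparing the loss scale before and after the update to detect a skipped step; the Linear model and random data are just placeholders for my actual setup:

import torch
import torch.nn as nn
import torch.optim as optim

device = "cuda" if torch.cuda.is_available() else "cpu"
use_amp = device == "cuda"

model = nn.Linear(10, 2).to(device)          # placeholder for the real model
criterion = nn.CrossEntropyLoss()
optimizer = optim.SGD(model.parameters(), lr=1e-3)
scheduler = optim.lr_scheduler.OneCycleLR(
    optimizer, max_lr=1e-3, steps_per_epoch=50, epochs=1, anneal_strategy='linear')
scaler = torch.cuda.amp.GradScaler(enabled=use_amp)

for step in range(50):
    image = torch.randn(8, 10, device=device)            # dummy batch
    labels = torch.randint(0, 2, (8,), device=device)
    optimizer.zero_grad()
    with torch.cuda.amp.autocast(enabled=use_amp):
        loss = criterion(model(image), labels)
    scaler.scale(loss).backward()
    scale_before = scaler.get_scale()
    scaler.step(optimizer)     # internally skips optimizer.step() on inf/NaN grads
    scaler.update()            # reduces the scale if the step was skipped
    if scaler.get_scale() >= scale_before:   # optimizer actually stepped
        scheduler.step()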

This is how I defined the learning rate scheduler:


scheduler = optim.lr_scheduler.OneCycleLR(
    optimizer,
    max_lr=hparams['learning_rate'],
    steps_per_epoch=int(len(train_loader)),
    epochs=hparams['epochs'],
    anneal_strategy='linear',
)
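
For reference, OneCycleLR derives its total schedule length from these two arguments, so it expects exactly one scheduler.step() per batch. A quick sketch, with made-up values standing in for my hparams and len(train_loader):

import torch.nn as nn
import torch.optim as optim

hparams = {'learning_rate': 5e-4, 'epochs': 10}   # made-up values
steps_per_epoch = 100                             # stand-in for int(len(train_loader))

optimizer = optim.AdamW(nn.Linear(10, 2).parameters(), lr=hparams['learning_rate'])
scheduler = optim.lr_scheduler.OneCycleLR(
    optimizer,
    max_lr=hparams['learning_rate'],
    steps_per_epoch=steps_per_epoch,
    epochs=hparams['epochs'],
    anneal_strategy='linear',
)
print(scheduler.total_steps)  # 1000 == steps_per_epoch * epochs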

Here is the full warning I am getting:

/usr/local/lib/python3.7/dist-packages/torch/optim/lr_scheduler.py:1565: UserWarning: To get the last learning rate computed by the scheduler, please use `get_last_lr()`.
  "please use `get_last_lr()`.", UserWarning)
/usr/local/lib/python3.7/dist-packages/torch/optim/lr_scheduler.py:134: UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`. In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`.  Failure to do this will result in PyTorch skipping the first value of the learning rate scheduler
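
As an aside, the first warning seems to fire whenever scheduler.get_lr() is called directly (e.g. when logging the learning rate); the documented accessor is get_last_lr():

current_lr = scheduler.get_last_lr()[0]  # one entry per param group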
