Weird warning popping out regarding learning rate scheduler

I’m getting this warning only after the first epoch:

/usr/local/lib/python3.6/dist-packages/torch/optim/lr_scheduler.py:200: UserWarning: Please also save or load the state of the optimizer when saving or loading the scheduler.
warnings.warn(SAVE_STATE_WARNING, UserWarning)

I’m unable to understand this warning. What does it mean?

Can you please share a snippet of the code you’re running?

@fadetoblack Yeah sure.
Here is my model and dataset file https://gist.github.com/saahiluppal/e1ffeac4c6c6c3da045cf07d8f8df37e

And here is how I run it:

model = ModelOne(config)
data = DataTypeOne(config)

trainer = pl.Trainer(gpus=1, max_epochs=config.epoch)
trainer.fit(model, data)

I had a similar discussion on Hugging Face: github issue

This happens whenever the scheduler’s state dict is saved or loaded without the optimizer’s, since the scheduler holds a reference to the optimizer’s state. See here: pytorch repo
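For context, here is a minimal sketch (not from the thread; the model, optimizer, and scheduler choices are illustrative) of what the warning is asking for: saving and loading the optimizer’s state together with the scheduler’s, e.g. in one checkpoint dict.

```python
import io
import torch

# Toy model, optimizer, and scheduler (illustrative choices)
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1)

# Save BOTH states in one checkpoint, as the warning suggests
checkpoint = {
    "optimizer": optimizer.state_dict(),
    "scheduler": scheduler.state_dict(),
}
buffer = io.BytesIO()          # in-memory file; a path works the same way
torch.save(checkpoint, buffer)

# ...and later restore both together
buffer.seek(0)
ckpt = torch.load(buffer)
optimizer.load_state_dict(ckpt["optimizer"])
scheduler.load_state_dict(ckpt["scheduler"])
```

Lightning’s own checkpointing already saves both for you; the warning fires because the state dicts are accessed individually under the hood.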

Seems that we can safely ignore it.
Thanks