Weird warning popping up regarding the learning rate scheduler

I’m getting this warning only after the first epoch:

/usr/local/lib/python3.6/dist-packages/torch/optim/ UserWarning: Please also save or load the state of the optimizer when saving or loading the scheduler.
warnings.warn(SAVE_STATE_WARNING, UserWarning)

I’m unable to understand this. What does this mean?

Can you please show us a snippet of your code which you’re running?

@fadetoblack Yeah sure.
Here is my model and dataset file

And here is how I run it:

model = ModelOne(config)
data = DataTypeOne(config)

trainer = pl.Trainer(gpus=1, max_epochs=config.epoch)
trainer.fit(model, data)

I had a similar discussion on huggingface: github issue

This happens when the optimizer state dict is accessed. See here: pytorch repo

Seems that we can safely ignore it.
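For anyone who does want to silence the warning rather than ignore it: it is asking you to checkpoint the optimizer's state together with the scheduler's. A minimal sketch of that (the model/optimizer choices here are illustrative, not from the code above):

```python
import io
import torch

# Illustrative setup: any model/optimizer/scheduler combination works the same way.
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1)

# Save both state dicts together, as the warning suggests.
buffer = io.BytesIO()  # stands in for a checkpoint file path
torch.save(
    {"optimizer": optimizer.state_dict(), "scheduler": scheduler.state_dict()},
    buffer,
)

# ... later, restore both together as well.
buffer.seek(0)
checkpoint = torch.load(buffer)
optimizer.load_state_dict(checkpoint["optimizer"])
scheduler.load_state_dict(checkpoint["scheduler"])
```

If you use Lightning's own checkpointing, the Trainer already saves optimizer and scheduler states together, so the warning is just noise from the state dict being accessed.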