Save loss parameter in checkpoint

Is it necessary to save the loss when saving a checkpoint?
Is it an important parameter?

If I save a checkpoint with model.state_dict(), optimizer.state_dict(), and scheduler.state_dict(), and then restart training from that checkpoint, the first epoch shows a different loss value than before.
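
Roughly, my setup looks like this (a simplified, self-contained sketch; the toy model, optimizer, scheduler, epoch counter, and file name are just placeholders for what my real training script uses):

```python
import torch
import torch.nn as nn

# Toy setup just to make the sketch runnable on its own; the real model,
# optimizer, and scheduler come from the actual training script.
model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10)
epoch = 5  # hypothetical current epoch

# Save everything needed to resume training in a single dict.
torch.save({
    "epoch": epoch,
    "model_state_dict": model.state_dict(),
    "optimizer_state_dict": optimizer.state_dict(),
    "scheduler_state_dict": scheduler.state_dict(),
}, "checkpoint.pt")

# ... later, restart from the checkpoint.
checkpoint = torch.load("checkpoint.pt")
model.load_state_dict(checkpoint["model_state_dict"])
optimizer.load_state_dict(checkpoint["optimizer_state_dict"])
scheduler.load_state_dict(checkpoint["scheduler_state_dict"])
start_epoch = checkpoint["epoch"] + 1
```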

The loss isn’t a parameter; it is computed from the model’s outputs, so you don’t need to save it except for informational purposes.
It is expected that the loss is different after loading the checkpoint and continuing training, just as it would be if you had kept running the original model right after saving it.
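
If you do want to keep the last loss value around for logging, one common pattern is to store it alongside the state dicts in the same dictionary. A minimal sketch (the toy model, optimizer, and the "loss" key name are just placeholders):

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 1)                                 # stands in for the real model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # stands in for the real optimizer
last_loss = 0.123                                        # e.g. loss.item() from the last batch

# The loss value is stored purely for logging/inspection;
# it is not needed to resume training.
torch.save({
    "model_state_dict": model.state_dict(),
    "optimizer_state_dict": optimizer.state_dict(),
    "loss": last_loss,
}, "checkpoint.pt")
```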

Best regards

Thomas

Thanks @Tom for the detailed answer.

What are the important parameters that we need to save in a checkpoint?