Hyperparameters in a Checkpoint

Is it necessary to save the hyperparameters in the checkpoint?

The recommended way of serializing a PyTorch model is to store its state_dict, which contains all parameters and buffers. While this is enough to recreate the model, load the state_dict, and run inference, you might also want to, e.g., continue training this model.
In that case it can be useful to store some hyperparameters (those not already included in, e.g., the optimizer's state_dict) so the training routine can be restored properly.
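As a minimal sketch of this idea (the checkpoint keys and the extra hyperparameters such as epoch and batch_size below are just illustrative assumptions, not a fixed convention), saving could look like this:

```python
import torch
import torch.nn as nn

# Minimal setup for illustration
model = nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10)

# Example hyperparameters that no state_dict captures (names are hypothetical)
epoch = 5
batch_size = 32

checkpoint = {
    'model_state_dict': model.state_dict(),
    'optimizer_state_dict': optimizer.state_dict(),
    'scheduler_state_dict': scheduler.state_dict(),
    'epoch': epoch,
    'batch_size': batch_size,
}
torch.save(checkpoint, 'checkpoint.pth')
```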

If we save the following state_dicts in a checkpoint, I believe we save all parameters and hyperparameters.
Is that correct?

model.state_dict()
optimizer.state_dict()
scheduler.state_dict()
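A hedged sketch of packing these into a single checkpoint dict and restoring it later could look like the following (the key names mirror the hypothetical save example above); note that values such as the starting epoch or the batch size are only available on load if they were stored explicitly, since no state_dict carries them:

```python
import torch
import torch.nn as nn

# Recreate the same objects before loading their state_dicts
model = nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10)

checkpoint = torch.load('checkpoint.pth')
model.load_state_dict(checkpoint['model_state_dict'])
optimizer.load_state_dict(checkpoint['optimizer_state_dict'])
scheduler.load_state_dict(checkpoint['scheduler_state_dict'])

# Extra hyperparameters have to be read back explicitly;
# they are not part of any state_dict
start_epoch = checkpoint['epoch']
batch_size = checkpoint['batch_size']
```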