Is it necessary to load the optimizer?

When I use torch.load(pretrained_model), is it also necessary to load the optimizer state, provided I set a proper learning rate? Thanks for your reply.

If you would like to continue training and you stored the state_dicts of both the model and the optimizer, both should be reloaded to resume training properly. Setting only a learning rate does not restore the optimizer's internal state (e.g. momentum buffers or Adam's running averages), so loading the optimizer's state_dict is needed to pick up exactly where training left off.
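
For reference, here is a minimal sketch of saving and restoring such a checkpoint; the file name "checkpoint.pth", the toy model, and the optimizer choice are just placeholders for illustration, not your actual setup:

```python
import torch
import torch.nn as nn

# Placeholder model and optimizer; substitute your own.
model = nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

# Save both state_dicts, e.g. at the end of an epoch.
torch.save(
    {
        "model_state_dict": model.state_dict(),
        "optimizer_state_dict": optimizer.state_dict(),
    },
    "checkpoint.pth",
)

# Later: restore both before continuing training.
checkpoint = torch.load("checkpoint.pth")
model.load_state_dict(checkpoint["model_state_dict"])
optimizer.load_state_dict(checkpoint["optimizer_state_dict"])
model.train()
```

If you only want to run inference with the pretrained weights, loading the model's state_dict alone is enough and the optimizer can be skipped.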