When we load a saved checkpoint into a model whose parameters do not exactly match the checkpoint's, we can pass strict=False to load_state_dict to allow the mismatch. However, Optimizer.load_state_dict has no such flag, and loading the saved optimizer state_dict results in an error. How can I load the optimizer state_dict when the parameters are different?
model.load_state_dict(checkpoint['model'], strict=False)  # works: mismatched keys are skipped
optimizer.load_state_dict(checkpoint['optimizer'])        # raises the error below
ValueError: loaded state dict contains a parameter group that doesn't match the size of optimizer's group
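One workaround I'm considering (only a sketch; the helper name and the index-alignment assumption are mine, not part of the PyTorch API): build the optimizer against the new model as usual, then copy over just the per-parameter state entries (e.g. Adam's exp_avg / exp_avg_sq) whose tensor shapes match the new parameters, while keeping the current optimizer's own param_groups so the size check passes. This assumes parameters shared by the old and new model keep the same positional index, which only holds if layers were added or removed at the end.

import torch

def load_matching_optimizer_state(optimizer, saved_optim_sd):
    """Hypothetical helper: restore only shape-compatible optimizer state.

    Assumes surviving parameters keep the same positional index as in
    the checkpointed model, so saved state keys can be matched by index.
    """
    current_params = [p for g in optimizer.param_groups for p in g['params']]
    kept_state = {}
    for idx, param in enumerate(current_params):
        entry = saved_optim_sd['state'].get(idx)
        if entry is None:
            continue
        # Keep the entry only if every tensor in it is either a scalar
        # (e.g. Adam's 'step' counter) or matches the parameter's shape.
        if all(not torch.is_tensor(v) or v.dim() == 0 or v.shape == param.shape
               for v in entry.values()):
            kept_state[idx] = entry
    # Use the current optimizer's own state_dict as a size-compatible
    # skeleton (so the param_groups check passes), then swap in the state.
    sd = optimizer.state_dict()
    sd['state'] = kept_state
    optimizer.load_state_dict(sd)

Usage would look something like this, after loading the model weights with strict=False:

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
load_matching_optimizer_state(optimizer, checkpoint['optimizer'])

Is this a reasonable approach, or is there a supported way to do a "non-strict" optimizer load?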