This error will be raised if you've stored the state_dict
from the nn.DataParallel
model and are now trying to load it into a plain (unwrapped) model.
You could strip the module.
prefix from the keys before loading (a sketch is shown below), or alternatively store the state_dict
without the wrapper via:
# Unwrap the DataParallel model and save the plain state_dict
sd = model.module.state_dict()
torch.save(sd, PATH)
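
If the checkpoint was already saved from the wrapped model, a minimal sketch for stripping the prefix at load time could look like this (PlainModel is a placeholder for your actual model class, PATH for your checkpoint path; str.removeprefix needs Python 3.9+, otherwise use k.replace("module.", "", 1)):

import torch

state_dict = torch.load(PATH)
# Drop the "module." prefix that nn.DataParallel adds to every key
state_dict = {k.removeprefix("module."): v for k, v in state_dict.items()}

model = PlainModel()  # placeholder for your plain (unwrapped) model
model.load_state_dict(state_dict)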