How can I load a model that was saved with `torch.save([model, criterion, optimizer], f)`?

Was the original model wrapped in DataParallel when it was saved? If so, that explains the `module.` prefix appearing in your state_dict keys. See my previous answer here for details on how to fix this in both the saving and loading situations.
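As a quick workaround on the loading side, you can strip the `module.` prefix from the keys before calling `load_state_dict`. A minimal sketch (the checkpoint keys below are illustrative, not from your model):

```python
from collections import OrderedDict

def strip_module_prefix(state_dict):
    """Remove the 'module.' prefix that nn.DataParallel prepends to keys."""
    return OrderedDict(
        (k[len("module."):] if k.startswith("module.") else k, v)
        for k, v in state_dict.items()
    )

# Example: keys as they appear after saving a DataParallel-wrapped model
saved = OrderedDict([("module.fc.weight", 1), ("module.fc.bias", 2)])
print(list(strip_module_prefix(saved).keys()))  # ['fc.weight', 'fc.bias']
```

You would then pass the cleaned dict to `model.load_state_dict(strip_module_prefix(checkpoint))`. The cleaner fix is to save `model.module.state_dict()` in the first place, so the checkpoint never carries the wrapper's prefix.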
