Model loaded for inference performs like a random init

It shouldn’t raise a warning, because I originally saved `model.state_dict()` while the model was wrapped in `nn.DataParallel()`, and then tried to load it by wrapping a freshly initialised model in `DataParallel` and loading the state dict into that. But that didn’t work, hence my original post. Once I changed to saving the underlying module’s state dict explicitly (i.e. without the `DataParallel` wrapper) and loading it back through `model.module`, it all worked.
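For reference, here is a minimal sketch of the fix. The model here is a hypothetical stand-in (a single `nn.Linear`); the real architecture is whatever you trained. The point is that `DataParallel` prefixes every state-dict key with `module.`, so saving from `dp_model.module` keeps the plain key names:

```python
import io
import torch
import torch.nn as nn

# Hypothetical stand-in model; the real architecture is whatever was trained.
model = nn.Linear(4, 2)
dp_model = nn.DataParallel(model)

# state_dict() of the wrapped model prefixes every key with "module.",
# which is why loading it into an unwrapped model fails.
wrapped_keys = list(dp_model.state_dict().keys())  # e.g. ["module.weight", "module.bias"]

# Saving the underlying module's state dict instead keeps the plain key names.
buf = io.BytesIO()
torch.save(dp_model.module.state_dict(), buf)

# A fresh, unwrapped model can then load it directly.
buf.seek(0)
fresh = nn.Linear(4, 2)
fresh.load_state_dict(torch.load(buf))
```

(The in-memory buffer just stands in for a checkpoint file on disk.)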

@pattiJane, see above for how I worked it out.