I saved a DataParallel model using model.module.state_dict() but was able to recover it with model.load_state_dict()

I saved my network as

checkpoint = {'epoch': epoch,
              'loss': loss,
              'model_state_dict': network.module.state_dict(),
              'optimizer_state_dict': optimizer.state_dict()}
torch.save(checkpoint, 'check-point.pth')

but was able to recover it using

with open('check-point.pth', 'rb') as f:
    checkpoint = torch.load(f)
    network.load_state_dict(checkpoint['model_state_dict'])
    optimizer.load_state_dict(checkpoint['optimizer_state_dict'])
    epoch = checkpoint['epoch']

though I had to wrap it with nn.DataParallel(network) afterwards. Shouldn't it throw an error, though, since network.module.load_state_dict() should have been used to recover the model?

If network is a “standard” model, i.e. not wrapped in nn.DataParallel, the workflow should be fine.
Once you wrap your model in nn.DataParallel, you would have to load the state_dict via network.module.load_state_dict.
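As a minimal sketch of why no error is raised (using a hypothetical nn.Linear as a stand-in for your network): a state_dict taken from network.module has plain parameter names, which match an unwrapped model, while the state_dict of the DataParallel wrapper itself prefixes every key with module.:

import torch.nn as nn

# Hypothetical stand-in for your network
model = nn.Linear(4, 2)
parallel = nn.DataParallel(model)

print(list(parallel.module.state_dict().keys()))  # ['weight', 'bias'] -> same keys as the plain model
print(list(parallel.state_dict().keys()))         # ['module.weight', 'module.bias']

# Loading the saved module state_dict into the unwrapped model works, since the keys match
model.load_state_dict(parallel.module.state_dict())

# After wrapping, you would load through .module instead
parallel.module.load_state_dict(model.state_dict())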

I’m not sure I understand the question correctly, but why do you think it should throw an error?