Could you explain more clearly how to wrap the current model in nn.DataParallel? Could you give an example?
I am getting the error Missing key(s) in state_dict: when I try to load the model back. When I save the model, I just call torch.save() on it.
I am new to PyTorch, thanks so much!
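To make the question concrete, here is a minimal sketch of what I think is happening (the nn.Linear model is just a placeholder for my real network): wrapping a model in nn.DataParallel registers it as a submodule named "module", so every key in the state_dict gains a "module." prefix, and loading that state_dict into a plain, unwrapped model then fails with the Missing key(s) error.

```python
import torch
import torch.nn as nn

# Placeholder network standing in for the real model
model = nn.Linear(4, 2)

# Wrapping in nn.DataParallel registers the original model as a
# submodule called "module", so every state_dict key gains that prefix.
dp_model = nn.DataParallel(model)
print(list(dp_model.state_dict().keys()))  # ['module.weight', 'module.bias']

# Loading the wrapped model's state_dict into a plain, unwrapped model
# fails, because the plain model expects "weight"/"bias" without the
# "module." prefix.
plain = nn.Linear(4, 2)
load_error = None
try:
    plain.load_state_dict(dp_model.state_dict())
except RuntimeError as err:
    load_error = err
    print(err)  # Missing key(s) in state_dict: "weight", "bias". ...

# One common workaround is to save the inner module's state_dict instead,
# e.g. torch.save(dp_model.module.state_dict(), ...), so the keys match
# a plain model at load time.
plain.load_state_dict(dp_model.module.state_dict())
```

Is this the right way to think about the error, or is there a more standard pattern for saving/loading a DataParallel model?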