torch.load() error: No module named 'model'

When I use torch.load(), it gives an error: No module named 'model'

import torch

# Use a raw string so backslashes in the Windows path are not treated as escapes.
path = r"D:\python\my_ML\model\resume.49.pkl"
LM_model = torch.load(path)


First, you should not serialize whole models but just their state_dict() to avoid such problems. Then you can recreate the model and load_state_dict() into it to get all the weights back.
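A minimal sketch of that workflow, using a hypothetical `Net` module and filename for illustration:

```python
import torch
import torch.nn as nn

# A small stand-in model; any nn.Module works the same way.
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)

    def forward(self, x):
        return self.fc(x)

model = Net()

# Save only the weights, not the whole pickled model object.
torch.save(model.state_dict(), "resume.49.pkl")

# Later, possibly in a different script: recreate the architecture,
# then load the weights into it.
restored = Net()
restored.load_state_dict(torch.load("resume.49.pkl"))
restored.eval()
```

Because only tensors are stored, loading does not depend on the class being importable under the exact same module path it had at save time.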

This is a Python serialization issue: when loading, you need exactly the same imports available as when you saved the model. The model's module must be importable under the same name it had when you saved.
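A small illustration of why, using the stdlib `pickle` module (which `torch.save` uses under the hood): pickling an instance stores only a *reference* to its class, i.e. the module name and class name, not the class definition itself. Unpickling then has to import that module, and fails with "No module named ..." if it is gone or renamed. The `Net` class here is a made-up example:

```python
import pickle

class Net:
    pass

payload = pickle.dumps(Net())

# The pickle stream records the defining module and the class *name*,
# not the class's code -- unpickling must be able to import both.
print(Net.__module__.encode() in payload)  # module name is in the stream
print(b"Net" in payload)                   # class name is in the stream
```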

Thank you very much, it worked the way you said. However, my model needs to be initialized by passing in a ''data'' object that contains the parameters used to build it, which is why I saved the whole model directly. If I only save the state_dict, I have to recreate that ''data'' object every time I rebuild the model, which is troublesome. Is there a better way?

I am not sure I understand the details of what you are doing here.
But maybe you can make this data object part of the state dict? That way you just need to save that, then load it and build your new model from it?
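One common pattern along those lines is to bundle the construction parameters and the weights into a single checkpoint dict. This is a sketch under the assumption that the ''data'' object reduces to plain constructor arguments; the `Net` class and its `in_dim`/`out_dim` parameters are made up for illustration:

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    # Stand-in for a model that needs configuration to be built;
    # the real "data" object's fields would go here.
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.fc = nn.Linear(in_dim, out_dim)

    def forward(self, x):
        return self.fc(x)

model = Net(in_dim=4, out_dim=2)

# Store the construction parameters next to the weights in one file.
checkpoint = {
    "config": {"in_dim": 4, "out_dim": 2},
    "state_dict": model.state_dict(),
}
torch.save(checkpoint, "checkpoint.pkl")

# Loading needs no pre-existing "data" object: rebuild from the config.
ckpt = torch.load("checkpoint.pkl")
restored = Net(**ckpt["config"])
restored.load_state_dict(ckpt["state_dict"])
```

Since the checkpoint contains only tensors and plain Python containers, it avoids the module-import problem that pickling the whole model has.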

Hi @albanD, I’m using someone else’s repo as a module in my code, and thus I have to change the Python imports to relative imports, which gives me this model load error for the reason that you mentioned here. I was wondering if there is a way that I could still load the checkpoint given that I need to change the imports?

Hi @meshghi, I'm facing a similar issue. Did you find a solution?