Load model saved without state_dict

I saved my model with torch.save(model), and I want to send it to another person. But when I load the model on another computer, I get a pickle error. How can I solve it?

I know the recommended method (saving the state_dict), but my aim is to save the model as a single file without dependencies (something like a TF frozen graph).
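For reference, the recommended method mentioned above saves only the weights, so the loading side must construct the model class itself first. A minimal sketch, assuming a small hypothetical `Net` class (not the poster's actual model):

```python
import torch
import torch.nn as nn

class Net(nn.Module):  # hypothetical example model
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)

    def forward(self, x):
        return self.fc(x)

model = Net()
# Save only the parameter tensors -- no class object is pickled:
torch.save(model.state_dict(), "weights.pth")

# The loading side still needs the Net class definition in scope:
restored = Net()
restored.load_state_dict(torch.load("weights.pth", map_location="cpu"))
restored.eval()
```

This is exactly why it does not satisfy the "single file without dependencies" goal: the receiving machine must have the class source available.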

Can you share the exact error message and the code showing how you are loading it, if possible?
As asked, the question is too broad (which pickle error?) to answer and invites hypothetical guessing. Hope you understand.

File "/mnt/media/users/renatkhiz/research/detection/configs/human.py", line 90, in <module>
    model.finetune_from(finetune_path)
File "/mnt/media/users/renatkhiz/research/detection/models/mobilenetv1ssd.py", line 88, in finetune_from
    weights = torch.load(path, map_location='cpu')
File "/home/renatkhiz/anaconda3/lib/python3.7/site-packages/torch/serialization.py", line 367, in load
    return _load(f, map_location, pickle_module)
File "/home/renatkhiz/anaconda3/lib/python3.7/site-packages/torch/serialization.py", line 538, in _load
    result = unpickler.load()
ModuleNotFoundError: No module named 'models/mynet'

This is the error.
I use torch.save(model) in my script infrastructure and send the saved model to my colleague, who uses torch.load(path) to load it. The model is an nn.Module.
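That traceback is consistent with how whole-module saving works: torch.save(model) pickles the object, and the pickle records the import path of the defining module, which torch.load then tries to import on the other machine. A small sketch of what ends up in the file, using a hypothetical `MyNet` class standing in for the poster's `models/mynet` module:

```python
import io
import torch
import torch.nn as nn

class MyNet(nn.Module):  # stands in for the class defined in models/mynet
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(2, 2)

buf = io.BytesIO()
# Pickles the whole object, embedding a reference to the defining
# module and class name rather than the class's code:
torch.save(MyNet(), buf)
buf.seek(0)

# The saved bytes contain the class name that torch.load must re-import;
# a machine without that module raises ModuleNotFoundError on load:
assert b"MyNet" in buf.getvalue()
```

So the colleague's machine fails not because the weights are missing, but because the module path baked into the pickle cannot be imported there.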

I found one way to solve this problem: use torch.jit.trace and then the standard torch.save and torch.load.
And I have a new question: does torch.jit.trace support a dynamic graph, or does it make the graph static?
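Tracing records the operations executed for one concrete input, so data-dependent control flow is baked in: the traced graph is static. torch.jit.script, by contrast, compiles the Python control flow itself. A sketch with a hypothetical module whose branch depends on the data:

```python
import torch
import torch.nn as nn

class Gate(nn.Module):  # hypothetical module with data-dependent control flow
    def forward(self, x):
        if x.sum() > 0:
            return x * 2
        return x + 1

# Tracing with a positive input records only the x * 2 branch
# (PyTorch emits a TracerWarning about the untracked branch):
traced = torch.jit.trace(Gate(), torch.ones(3))
out_traced = traced(-torch.ones(3))    # replays x * 2 even for negative input

# Scripting compiles the if-statement, so both branches survive:
scripted = torch.jit.script(Gate())
out_scripted = scripted(-torch.ones(3))  # correctly takes the x + 1 branch
```

Note that a ScriptModule is normally saved with torch.jit.save(traced, "model.pt") and loaded with torch.jit.load, which gives exactly the single self-contained file described above: no Python class definition is needed on the receiving machine.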
