How to load a PyTorch model without having to import the class

I have a notebook where I defined my model, and I saved the model. Is there a way to load the model without importing the class definition? Importing it is taking time.

I tried torch.save(model, path)
and then tried to load it from another notebook using torch.load(). If I import the class definition, it works.


I believe torch.load doesn't need the class definition — or is it only loading a state dict that requires the class definition?

This link clearly explains different possibilities

Thanks. But it gives me an unpickling error when I use torch.save and torch.load. I am trying to load it from a different .ipynb file.

# Model class must be defined somewhere
model = torch.load(PATH)

It is from the PyTorch "Saving and Loading Models" tutorial.

Can you share the error message?

torch.load('issue.pth') gives me

AttributeError                            Traceback (most recent call last)
----> 1 a = torch.load('issue.pth')

/anaconda3/lib/python3.6/site-packages/torch/ in load(f, map_location, pickle_module, **pickle_load_args)
    385         f = open(f, 'rb')
    386     try:
--> 387         return _load(f, map_location, pickle_module, **pickle_load_args)
    388     finally:
    389         if new_fd:

/anaconda3/lib/python3.6/site-packages/torch/ in _load(f, map_location, pickle_module, **pickle_load_args)
    572     unpickler = pickle_module.Unpickler(f, **pickle_load_args)
    573     unpickler.persistent_load = persistent_load
--> 574     result = unpickler.load()
    576     deserialized_storage_keys = pickle_module.load(f, **pickle_load_args)

AttributeError: Can't get attribute 'CharLoopConcatModell' on <module '__main__'>

You have to define the class of the model before you load it from disk.
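The reason is that torch.save(model, path) pickles the whole object, and Python's pickle stores only a reference to the class by name, not the class code itself. A minimal stdlib sketch of the same failure and fix (CharModel is a hypothetical stand-in for the model class, not the original code):

```python
import io
import pickle

# Hypothetical stand-in for the model class from the question.
class CharModel:
    def __init__(self, weights):
        self.weights = weights

buf = io.BytesIO()
# pickle stores only the reference "CharModel" by name, not its code.
pickle.dump(CharModel([1.0, 2.0]), buf)

buf.seek(0)
restored = pickle.load(buf)  # works: CharModel is defined in this module
print(restored.weights)

# Simulate unpickling in a notebook where the class was never defined:
del CharModel
buf.seek(0)
err = None
try:
    pickle.load(buf)
except AttributeError as e:
    err = e      # "Can't get attribute 'CharModel' on <module ...>"
    print(type(e).__name__)
```

This is the same AttributeError as in the traceback above, which is why defining (or importing) the class before calling torch.load makes it work.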

Just to end this discussion: you can do it with

model_scripted = torch.jit.script(model)  # Export to TorchScript
model_scripted.save('model_scripted.pt')  # Save

And then load it with:

model = torch.jit.load('model_scripted.pt')

See also the "Saving and Loading Models" tutorial, under "Export/Load Model in TorchScript Format".
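To make the TorchScript route concrete, here is a minimal, runnable round trip (TinyModel and the file name are placeholders, not the original model). The key point is that the saved file contains both the code and the weights, so torch.jit.load needs no access to the class definition:

```python
import torch
import torch.nn as nn

class TinyModel(nn.Module):
    """Hypothetical stand-in for the real model class."""
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(3, 1)

    def forward(self, x):
        return self.linear(x)

# Script and save: the file carries the serialized code and parameters.
scripted = torch.jit.script(TinyModel())
scripted.save("tiny_scripted.pt")

# In another notebook/process this works without importing TinyModel.
restored = torch.jit.load("tiny_scripted.pt")
out = restored(torch.zeros(1, 3))
print(out.shape)  # torch.Size([1, 1])
```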
