Saving custom models

I have defined a custom model as follows:

import torch
import torch.nn as nn
import torch.nn.functional as F

# Hidden-layer sizes (example values; set them to whatever your network needs)
SIZE_H1, SIZE_H2, SIZE_H3 = 128, 64, 32


class model(nn.Module):
    """Defines a custom feed-forward model."""

    def __init__(self, dim_input, dim_output):
        super(model, self).__init__()
        self._dim_input = dim_input
        self._dim_output = dim_output

        # Initialize network layers; nn.Linear takes (in_features, out_features),
        # so each layer's input size must match the previous layer's output size
        self._l1 = nn.Linear(self._dim_input, SIZE_H1)
        self._l2 = nn.Linear(SIZE_H1, SIZE_H2)
        self._l3 = nn.Linear(SIZE_H2, SIZE_H3)
        self._bn = nn.BatchNorm1d(SIZE_H3)

    def forward(self, x):
        # nn.ReLU, nn.BatchNorm1d and nn.Sigmoid are module classes, not functions,
        # so apply the activations functionally and keep BatchNorm as a registered layer
        h1 = F.relu(self._l1(x))
        h2 = F.relu(self._l2(h1))
        h3 = self._bn(self._l3(h2))
        return F.sigmoid(h3)

How do I save this as a .t7 file? Thanks

Why not save it as a .pth file with torch.save(model, "model.pth")?
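For reference, a minimal sketch of that suggestion, saving and restoring the whole module object (the dimensions and filename are just placeholders):

import torch

net = model(dim_input=16, dim_output=1)  # an instance of the custom module above
torch.save(net, "model.pth")             # pickles the entire module object
restored = torch.load("model.pth")       # the class definition must be importable here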

Kinda embarrassed, but I had the order of the filename and the module reversed in torch.save()… Thanks anyway!

Spoke too soon: I tried saving it as a .t7 file, but I can't load it with load_lua.

I want to load this like I would load a Sequential model. Can I not do that?

PyTorch serialization uses a different format than Lua torch. You can load .t7 files with load_lua, but the ones saved with torch.save in Python are only readable in Python.
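To illustrate the two paths, here is a sketch assuming an older PyTorch build that still ships torch.utils.serialization (the filenames are placeholders):

import torch
from torch.utils.serialization import load_lua

lua_net = load_lua("legacy_model.t7")  # a model exported from Lua Torch
py_net = torch.load("model.pth")       # a model saved with torch.save in Python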

Oh, do you have any suggestions then on how I can best load/save this custom model in Python? Should I just write my model as an nn module (e.g. using Sequential) and save/load it with torch.save() and load_lua respectively?

nvm, got it - thanks!

BTW we recommend using state_dicts for serializing models.
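For example, a minimal sketch of saving a state_dict for the custom module above (the dimensions and filename are placeholders):

import torch

net = model(dim_input=16, dim_output=1)
torch.save(net.state_dict(), "model_state.pth")  # saves only parameters and buffers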

I have a model defined in vae.py and I am trying to save it, but I don't understand what the required argument f="model.pth" means. Also, how can I use state_dicts? Thanks a lot.

I'm sorry, but I don't understand the first part of your question. You can obtain a state_dict using the state_dict() method of any module. When you resume training from a checkpoint, you should still create a new model with random weights, and then call load_state_dict(serialized_dict) on it. This will replace the random values with the serialized weights.
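A minimal sketch of that resume workflow (the dimensions and filename are placeholders):

import torch

net = model(dim_input=16, dim_output=1)          # fresh model with random weights
serialized_dict = torch.load("model_state.pth")  # the previously saved state_dict
net.load_state_dict(serialized_dict)             # replaces the random weights
# ...continue training from here...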


Hey @Anmol6, did you find a way to save a model in PyTorch and load it in Lua?

Thanks

Hey, I am a beginner and was trying to save the parameters of a pretrained network in an HDF5 file so I could load them in Torch, but I was unsuccessful.

Could you please let me know how to save the parameters of a pretrained network (which is in PyTorch) in an HDF5 file?

Thanks!

I can tell you how I saved the model, and it worked:

import pickle

with open("./DUMPS/model_DUMP.txt",“wb”) as file:
_ pickle.dump(model,file)_

This saved the model. Then, when I restarted training, instead of starting from zero I loaded the saved model:

with open("DUMPS/model_DUMP.txt", "rb") as mod:
    existing_model = pickle.load(mod)

and loaded the state of the saved model into an already-initialized model:

shared_model.load_state_dict(existing_model.state_dict())

The training continued from where I stopped the training.

Does it save the model in PyTorch and load it in Lua?

Do you mean whether pickle can save the model in Python and load it in Lua? I think not, because pickle is a Python-specific package. I don't know Lua though, so maybe there exists some workaround or a specific Lua package that can read pickle serializations.

Yes, pickle does not load in Lua.
I was trying to save the weights in an HDF5 file and then load them in Lua, but it was not working out because I am making an error somewhere.
This is the error I get. Any workaround?

  File "scripts/run_model.py", line 387, in <module>
    main(args)
  File "scripts/run_model.py", line 115, in main
    fout.create_dataset('pgparams', data=pgparams)
  File "/usr/local/lib/python2.7/dist-packages/h5py/_hl/group.py", line 105, in create_dataset
    dsid = dataset.make_new_dset(self, shape, dtype, data, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/h5py/_hl/dataset.py", line 93, in make_new_dset
    tid = h5t.py_create(dtype, logical=1)
  File "h5py/h5t.pyx", line 1450, in h5py.h5t.py_create (/tmp/pip-4rPeHA-build/h5py/h5t.c:16078)
  File "h5py/h5t.pyx", line 1470, in h5py.h5t.py_create (/tmp/pip-4rPeHA-build/h5py/h5t.c:15912)
  File "h5py/h5t.pyx", line 1525, in h5py.h5t.py_create (/tmp/pip-4rPeHA-build/h5py/h5t.c:15813)
TypeError: Object dtype dtype('O') has no native HDF5 equivalent

Thanks!
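For what it's worth, here is one possible workaround as a sketch (not the original run_model.py): the "Object dtype" error usually means create_dataset was handed a Python object such as a list of tensors, so writing each parameter as its own numeric dataset avoids it. Here net stands in for the pretrained PyTorch model:

import h5py

# net is assumed to be the pretrained PyTorch model whose weights we want in HDF5
with h5py.File("weights.h5", "w") as fout:
    for name, param in net.state_dict().items():
        # h5py can only store numeric arrays, not arbitrary Python objects,
        # so convert each tensor to a numpy array and store it under its name
        fout.create_dataset(name, data=param.cpu().numpy())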

@Soumith_Chintala, could you please help me?

Thanks a lot !

Hey, Rao, I know this post is a few months old, but were you ever able to package the model to HDF5? Thanks.