Advantages & disadvantages of using the pickle module to save models vs torch.save

I was reading the Saving and Loading Models tutorial, but it wasn’t clear why I’d use torch.save over pickle.dump.

What worries me is that my neural net modules/objects have many more things inside of them besides just their parameters. Since PyTorch saves things using the state_dict, I was worried that something might be missing (https://pytorch.org/tutorials/beginner/saving_loading_models.html).

Isn’t it easier for my use case to simply use pickle.dump? Why would I use torch.save in general?

Note that I am using GPUs; not sure if that matters.

The easiest thing for me would be if I could just save the whole Python program state (similar to how MATLAB used to allow you to do it: https://www.mathworks.com/help/matlab/ref/save.html).


The state_dict will store all registered parameters and buffers.
If you need to serialize additional tensors, you should therefore create an nn.Parameter if the tensor is trainable, or register a buffer via self.register_buffer(name, tensor) if it’s not.
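
For example, a minimal sketch (the module and attribute names are made up) of what does and does not end up in the state_dict:

import torch
import torch.nn as nn

class MyModule(nn.Module):
    def __init__(self):
        super().__init__()
        # trainable tensor -> registered as a parameter
        self.scale = nn.Parameter(torch.ones(10))
        # non-trainable tensor -> registered as a buffer
        self.register_buffer("running_stat", torch.zeros(10))
        # plain attribute -> not included in the state_dict
        self.tmp = torch.randn(10)

module = MyModule()
print(module.state_dict().keys())
# odict_keys(['scale', 'running_stat'])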

Answered in the other thread.

That’s generally not recommended.
Instead, you should store the state_dicts and the source files separately.
Storing e.g. the complete model could force you to recreate exactly the same file and folder structure when loading it.
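
A minimal sketch of that workflow (the file names and toy model are placeholders), assuming you keep the model definition in your source files and only serialize the state_dict:

import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

# recommended: save only the registered parameters and buffers
torch.save(model.state_dict(), "checkpoint.pt")

# later: rebuild the architecture from your source code, then load the weights
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
model.load_state_dict(torch.load("checkpoint.pt"))

# not recommended: pickling the whole model ties the checkpoint to the exact
# file and folder structure of the code that was used to save it
torch.save(model, "whole_model.pt")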


Hi @ptrblck, thanks for your responses! Greatly appreciated :slight_smile:

I am curious: why is it not recommended to store the whole Python program state (or as close to that as possible)?

I am currently using dill to simulate this as much as possible because my neural network module classes contain pointers to many things, including lambda functions I’d like to restore properly.
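
For reference, a small sketch of the difference that makes dill attractive here (the lambda is just a toy example): the standard pickle module refuses to serialize a lambda, while dill serializes the function by value:

import pickle
import dill

f = lambda x: x * 2

try:
    pickle.dumps(f)                  # plain pickle can't serialize a lambda
except (pickle.PicklingError, AttributeError) as err:
    print("pickle failed:", err)

payload = dill.dumps(f)              # dill serializes the function itself
g = dill.loads(payload)
print(g(3))                          # 6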


As mentioned in my last post, you could be forced to recreate the exact file and folder structure if you save the complete model via torch.save.
I don’t know how dill behaves in this case.

Yes, the same file structure needs to be maintained.
Moreover, the functions stored at save time will still be the ones used after loading.
I loaded a YOLOv5 model that was saved with version 4.0, which uses SiLU as the activation, but I accidentally used a local copy of YOLOv5 version 1.0, which uses LeakyReLU. However, the loaded model still used SiLU, so I guess the function is stored with the serialized model.
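
A rough sketch of why that can happen (the class below is made up, not actual YOLOv5 code): saving the whole model pickles the module objects, including the activation instance, and unpickling does not re-run __init__, so the activation assigned at save time comes back even if the local source code has changed:

import torch
import torch.nn as nn

class Block(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 3, 1)
        self.act = nn.SiLU()   # imagine a later code version switches this to nn.LeakyReLU()

model = Block()
torch.save(model, "block.pt")  # pickles the whole object graph, nn.SiLU() included

# __init__ is not executed again on load, so the pickled SiLU instance is restored
# (recent PyTorch releases may require torch.load("block.pt", weights_only=False))
loaded = torch.load("block.pt")
print(loaded.act)              # SiLU()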

Is the with statement necessary when we use torch.save? That is, should we write something like:

with open('foo.pt', 'wb') as fhand: torch.save(obj_to_save, fhand)

?

No, since torch.save will create and open the file for you if it doesn’t exist.
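
Both forms work; a short sketch with placeholder file and object names:

import torch

obj_to_save = {"step": 1, "weights": torch.randn(3)}

# torch.save accepts a path and creates/opens the file itself
torch.save(obj_to_save, "foo.pt")

# it also accepts an already opened file-like object, so the with statement is optional
with open("foo.pt", "wb") as fhand:
    torch.save(obj_to_save, fhand)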