LibTorch(C++) => TorchScript => PyTorch(Python) example

Hi,

Are there examples to go from C++ to PyTorch in terms of model saving and loading?

All the examples online are for the reverse case (prototyping in Python => TorchScript => serving the model in C++).

I couldn’t figure out a way to trace a model in C++ and save it to disk.

Thanks.

@abhinavkulkarni
See my earlier post:
Load saved file from LibTorch to PyTorch failed

@glaringlee: Can you please explain how you save an nn::Module in libtorch to a file? And how to load that file in Python? When I try your suggestion and do

torch::save(model, "model.pt");

I get errors of the type

‘torch::serialize::OutputArchive’ is not derived from ‘std::basic_ostream<_CharT, _Traits>’
   archive << value;

Hey, can you provide more info?
What’s your pytorch version?
Can you provide enough code for me to test on my end?

Hi, I’m using libtorch 1.7.1, but that shouldn’t really matter. As for code, you can take this simple MNIST example and add the line of code above to save the model. It won’t compile.
In any case, that’s not the point of the ticket. I know how to export a model in libtorch (serializing to an OutputArchive, etc.), but I cannot load that model in Python. The problem I’m trying to solve is how to export a model from libtorch that can be loaded in Python.

@botelho The version really matters to us, since the serialization code keeps changing between versions.
The example was written for libtorch 1.4; I think we need to update it.
There is a post here which can solve your problem. Please take a look.

Hi @glaringlee. I don’t think you read my last comment to the end. I already know how to export a model and read it back in libtorch. What I cannot do is take a model saved by libtorch and load it in Python. In fact, that is the subject of this ticket: LibTorch(C++) => TorchScript => PyTorch(Python).

@botelho Sorry, I got confused.
You originally said torch::save(model, "model.pt"); gave a compile error. If you have solved that problem, then you should be able to load your saved model.pt in Python using torch.jit.load, which is covered in the link I pasted earlier in this post, no?

Here is the link: Load saved file from LibTorch to PyTorch failed - #5 by glaringlee

Please make sure the versions of libtorch and pytorch match.

Hi @glaringlee. Actually, doing

module = torch.jit.load('model.pt')

does not work. That is, the load call itself returns without errors, but if you then try to do a forward pass on the model, you get something like this:

Traceback (most recent call last):
  File "./load.py", line 63, in <module>
    main()
  File "./load.py", line 59, in main
    output = model(data)
  File "/home/botelho/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 727, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/home/botelho/.local/lib/python3.6/site-packages/torch/jit/_script.py", line 558, in __getattr__
    return super(RecursiveScriptModule, self).__getattr__(attr)
  File "/home/botelho/.local/lib/python3.6/site-packages/torch/jit/_script.py", line 288, in __getattr__
    return super(ScriptModule, self).__getattr__(attr)
  File "/home/botelho/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 779, in __getattr__
    type(self).__name__, name))
torch.nn.modules.module.ModuleAttributeError: 'RecursiveScriptModule' object has no attribute 'forward'

And yes, both my libtorch and pytorch versions are 1.7.0.

@botelho
I see the difference now. I'm sorry, my bad. I had a wrong impression of my previous post. The libtorch => pytorch save/load direction supports saving tensors only at this point.

Hi @glaringlee. Sounds good, thanks for confirming.