Loading a file saved from LibTorch into PyTorch fails

I tried to save a tensor in LibTorch and load it from PyTorch, but it failed. My LibTorch and PyTorch versions are both 1.5.1. Does anyone have any clues or a workaround? Thanks!

In C++, I used the following code to save an image tensor.
auto filename = "sample.img";
torch::save(fake_img, torch::str(filename));

In Python, I got a runtime error when I tried to load the file saved by the C++ code.

import torch
sample_file = './sample.img'
module = torch.jit.load(sample_file)

RuntimeError: version_number <= kMaxSupportedFileFormatVersion INTERNAL ASSERT FAILED at /opt/conda/conda-bld/pytorch_1573049387353/work/caffe2/serialize/inline_container.cc:131, please report a bug to PyTorch. Attempted to read a PyTorch file with version 2, but the maximum supported version for reading is 1. Your PyTorch installation may be too old. (init at /opt/conda/conda-bld/pytorch_1573049387353/work/caffe2/serialize/inline_container.cc:131)

If I use LibTorch 1.6 and PyTorch 1.6, the error message is:

RuntimeError: version_number <= kMaxSupportedFileFormatVersion INTERNAL ASSERT FAILED at /opt/conda/conda-bld/pytorch_1573049387353/work/caffe2/serialize/inline_container.cc:131, please report a bug to PyTorch. Attempted to read a PyTorch file with version 3, but the maximum supported version for reading is 1. Your PyTorch installation may be too old. (init at /opt/conda/conda-bld/pytorch_1573049387353/work/caffe2/serialize/inline_container.cc:131)

@Carol_Ye_Liu
Try torch.load instead of torch.jit.load.
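
Something like this (a minimal sketch, reusing the path from your snippet):

import torch

sample_file = './sample.img'
# load the file written from C++ as a regular (non-JIT) archive
fake_img = torch.load(sample_file)
print(type(fake_img))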

Thanks. I tried torch.load, but got the following errors:
UnicodeDecodeError: 'ascii' codec can't decode byte 0xb3 in position 1: ordinal not in range(128)
RuntimeError: /sample.img is a zip archive (did you mean to use torch.jit.load()?)

And the same errors persist when I change my C++ code to save the image as below and use torch.load to load it in Python:
auto bytes = torch::jit::pickle_save(fake_img.cpu());
std::ofstream fout("fake_img.img", std::ios::out | std::ios::binary);
fout.write(bytes.data(), bytes.size());
fout.close();

@Carol_Ye_Liu
Can you check the PyTorch version you installed? It seems to me that when you tested PyTorch 1.6 + LibTorch 1.6, the PyTorch you were actually running might not have been 1.6.
You can print the PyTorch version with:

import torch
print(torch.__version__)

I tried this on my machine with the latest PyTorch + LibTorch, and it works for me.

If you are trying to store a tensor instead of a module, you can use pickle_save in C++ and torch.load() in Python.
Here is the sample code in C++; the Python side is straightforward.

torch::Tensor ten = torch::rand({12, 12});
auto bytes = torch::pickle_save(ten); // this is actually a std::vector of char
std::ofstream fout("ten.zip", std::ios::out | std::ios::binary);
fout.write(bytes.data(), bytes.size());
fout.close();

And in Python you can do:

import torch
mytensor = torch.load("ten.zip")

If you are trying to save a module using JIT, then you should do what you did before in C++ and load it using torch.jit.load. But this time, what you are loading is a module object instead of a Tensor.
If I use torch::save to save your fake_img tensor, here is what I get when loading it:

In [1]: import torch
In [2]: a = torch.jit.load("sample.img")
In [3]: a
Out[3]: RecursiveScriptModule(original_name=Module)
In [4]: a.__dict__
Out[4]: 
{'_initializing': False,
 '_c': <torch._C.ScriptModule at 0x7f07c609bb30>,
 '_parameters': <torch.jit._script.OrderedDictWrapper at 0x7f08232c0a90>,
 '_buffers': <torch.jit._script.OrderedDictWrapper at 0x7f07c655f940>,
 '_non_persistent_buffers_set': set(),
 '_backward_hooks': OrderedDict(),
 '_forward_hooks': OrderedDict(),
 '_forward_pre_hooks': OrderedDict(),
 '_state_dict_hooks': OrderedDict(),
 '_load_state_dict_pre_hooks': OrderedDict(),
 '_modules': <torch.jit._script.OrderedModuleDict at 0x7f08231ae490>,
 '_concrete_type': <torch._C.ConcreteModuleType at 0x7f0820bcf7b0>}
In [5]: a._parameters.keys()
Out[5]: ['0']

Your tensor will be in

a._parameters['0']
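
So, assuming the file name from your original snippet, retrieving the tensor looks roughly like this:

import torch

module = torch.jit.load("sample.img")  # module written by torch::save in C++
fake_img = module._parameters['0']     # the tensor is stored as parameter '0'
print(fake_img.shape)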

Many thanks for all the details! I double-checked my torch versions and found that the latest PyTorch was not being picked up in my Jupyter notebook.