How to save a TorchScript model using C++?

I know how to save a TorchScript model using Python and load it with C++, from the doc: link.

But how can I save a TorchScript model using C++? I need to train my model in C++, but I can't find any details about this in the documentation.


You can refer to the repository below to do what you want.

  1. First, create a C++ model with the same structure as the Python model.
  2. Save your Python model as a JIT script.
  3. Instead of loading the model as a JIT script, use torch::load to load it.

Created by referring to torchvision.

Hi, thanks for your reply.
But I need to train the model in C++ and save it as a JIT script. The reason I need to save the model as a JIT script is that the inference code can't access the model structure (the model definition class). Thus, I need to find a way to save the model as a JIT script using C++.
However, the code you provided above requires passing a Model instance as a parameter, so it doesn't meet my need.

I'm not sure I follow.
Is it to avoid exposing the model structure during inference?

That is one aspect. On the other hand, the scripted model also offers higher performance.

Hi, have you solved it? I have the same problem. I tried to train the script model in C++, but it doesn't work. Could you please give me some help if you have trained it successfully? Thanks.


Hi, I found that both TorchScript and torch::nn::Module model files can be loaded with torch::load(model, "");. It isn't necessary to save a TorchScript model in C++ if you load the TorchScript model into an nn::Module and train it.

link : (libtorch) Save MNIST c++ example's trained model into a file, and load in from another c++ file to use for prediction?

@yf225 I am sorry to disturb you. Could you give me a little hint about whether libtorch can save a TorchScript model using C++ or not? Thank you so much!

@SixerWang By “saving a TorchScript model in C++”, do you mean one of the following:

  1. You want to embed Python/TorchScript code in your C++ code, and then save that portion of code to a TorchScript model file.
    Answer: This is not supported right now, but it is possible to implement / expose this API. I'd suggest opening an issue so we can discuss the use cases.
  2. You want to save a model written in the C++ frontend to a TorchScript model file.
    Answer: This is not supported and likely won't be supported in the short term, because implementing this mechanism is a huge effort, and we would need more use cases to establish this as a project worth working on.

Thank you very much for your reply. My situation is the second one. I will change my training process. Thank you!

Hi, have you solved the problem?