How to load a model from PyTorch into Caffe2 with C++?

(Hotaek Han) #1

Hello,

I have been trying to load a trained PyTorch model into Caffe2 or TensorRT. So far I have tried:

  1. Export the model to ONNX format, then read it in Caffe2.
    -> This works fine in Python, but I can’t figure out how to do it in C++. I mean, I can’t find a Caffe2 C++ API for the ONNX backend.

  2. Export the model to ONNX format, then read it in TensorRT.
    -> My model consists of CNN and LSTM layers, like this: CNN-Pool-CNN-Pool-Reshape-LSTM.
    -> The input shape of PyTorch’s LSTM is (seq x batch x feat), so I reshape the tensor coming out of the pooling layer from (batch x channel x 1 x width) to (seq x batch x feat).
    -> But TensorRT doesn’t support permutations that move the batch dimension.

I think case 2) is a TensorRT issue, but case 1) is related to pytorch/caffe2.
Any help?

Thanks

(Alex Veuthey) #2

I don’t know about TensorRT, but for Caffe2 you can save the model as predict and init protobuf files, as shown in this tutorial. Then I guess you can import them with the C++ API? I haven’t tried it, but that seems like the way to go…

(Hotaek Han) #3

Thanks for your reply.
I read that tutorial, and yes, I tried to do the same thing in C++.
But I couldn’t find any sample code or tutorial for Caffe2 in C++.
I did find the Caffe2 C++ API reference, but it looks a little messy… :frowning:

Oh, I found the tutorial :slight_smile: