C++ inference on a detection model

Hello, I have trained a model with PyTorch, but how do I run inference in C++?

It’s a detection model similar to RFBNet and SSD, so has anyone tried loading this kind of model in C++?

MaskRCNN-benchmark has a PR for tracing the model.

Best regards

Thomas

To use a PyTorch model for inference in C++, you first need to convert the model to TorchScript. This is a good tutorial: https://pytorch.org/tutorials/advanced/cpp_export.html.
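As a minimal sketch of the Python-side step the tutorial describes (the tiny `nn.Sequential` stand-in and the file name are placeholders, not a real detector from this thread): trace the model with an example input and save it, so that the C++ side can load the file with `torch::jit::load`.

```python
import torch
import torch.nn as nn

# Stand-in network; a real detector (SSD/RFBNet) would go here.
model = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU())
model.eval()

# Tracing records the operations executed on an example input.
example = torch.rand(1, 3, 32, 32)
traced = torch.jit.trace(model, example)

# The saved archive is what torch::jit::load() reads in C++.
traced.save("traced_model.pt")

# Sanity check: the traced module matches the eager module.
with torch.no_grad():
    assert torch.allclose(traced(example), model(example))
```

Note that tracing only records the path taken for the example input; detection models with data-dependent control flow (e.g. post-processing/NMS in Python) may need `torch.jit.script` or may have to keep that logic outside the traced graph.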

I know; I have successfully traced a classification model and loaded it in C++, but I’m not sure about detection models.

BTW, is there any comparison of C++ inference speed between libtorch and Caffe or TensorFlow? Which is faster for the same model architecture?

@jinfagang did you successfully run model inference from C++?

Yes, but it doesn’t get much of a speedup (just a little).

I am going to use ONNX Runtime as the inference engine instead; it can use Intel MKL for CPU acceleration or TensorRT for CUDA acceleration.

Did you face any difficulties exporting the detection model to C++?
Did everything run smoothly for you?