Faster RCNN C++

Looking to see if anyone has successfully deployed a Torchvision Faster RCNN (or Mask RCNN) model to C++ via TorchScript/libtorch.

In Python:

import torch
import torchvision

model = torchvision.models.detection.maskrcnn_resnet50_fpn(pretrained=True)
script_model = torch.jit.script(model)
script_model.save("model.pt")

In C++:

module = torch::jit::load("model.pt");
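
For reference, here is roughly how I would expect to run inference once the model loads, going by the scripted forward signature (a minimal sketch, not my exact code; the input size, file name, and main() wrapper are placeholders):

#include <torch/script.h>
#include <iostream>
#include <vector>

int main() {
    torch::jit::script::Module module = torch::jit::load("model.pt");
    module.eval();

    // Scripted torchvision detection models take a List[Tensor] of 3xHxW images
    c10::List<torch::Tensor> images;
    images.push_back(torch::rand({3, 480, 640}));

    std::vector<torch::jit::IValue> inputs;
    inputs.emplace_back(images);

    // When scripted, the model returns a (losses, detections) tuple;
    // with eval() the losses dict is empty and detections holds the results
    auto output = module.forward(inputs).toTuple();
    auto detections = output->elements()[1];
    std::cout << detections << std::endl;
    return 0;
}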

I linked the library with qmake in Qt Creator using this .pro file on Windows 10:

QT -= gui

CONFIG += c++14 console no_keywords
CONFIG -= app_bundle

QMAKE_CXXFLAGS += -D_GLIBCXX_USE_CXX11_ABI=0
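# Force the MSVC linker to keep a reference to a torch_cuda symbol so the CUDA
# backend isn't dropped at link time (the usual libtorch-on-Windows workaround):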
QMAKE_LFLAGS += -INCLUDE:?warp_size@cuda@at@@YAHXZ

DEFINES += QT_DEPRECATED_WARNINGS

SOURCES += \
        main.cpp

LIBS += -LD:\src\PyTorch_Playground\c++\testLibTorch\libtorch-win-shared-with-deps-debug-latest\libtorch\lib \
-lc10 -lc10_cuda -ltorch_cuda -ltorch_cpu -ltorch

INCLUDEPATH += D:\src\PyTorch_Playground\c++\testLibTorch\libtorch-win-shared-with-deps-latest\libtorch\include
INCLUDEPATH += D:\src\PyTorch_Playground\c++\testLibTorch\libtorch-win-shared-with-deps-latest\libtorch\include\torch\csrc\api\include

This works when the loaded model is a plain ResNet as well as FCN-ResNet: both run and give correct outputs. RCNN models, however, throw an exception:

schemas.size() > 0 INTERNAL ASSERT FAILED at "..\..\torch\csrc\jit\frontend\schema_matching.cpp":491, please report a bug to PyTorch.

The TorchScript RCNN model has no problem running in Python.
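
To be concrete, a rough sketch of the kind of check that passes for me (the dummy input size is arbitrary; as far as I understand, the scripted detection models always return a (losses, detections) tuple):

import torch

loaded = torch.jit.load("model.pt")
loaded.eval()

with torch.no_grad():
    # Detection models expect a list of 3xHxW image tensors
    losses, detections = loaded([torch.rand(3, 480, 640)])

# With eval() the losses dict is empty; detections holds boxes, labels, scores (and masks)
print(detections[0]["boxes"].shape)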

There is a few-month-old GitHub issue (to which I added my comments): https://github.com/pytorch/pytorch/issues/35881

Since this is seriously blocking my transition from TensorFlow to PyTorch, I'm hoping crossposting it here will attract more eyes. I'm not sure whether it's a bug, a mistake on my part, or a linking problem. I'm also wondering whether anyone has had this working on earlier versions. I'm using the nightly builds because I was playing with AMP and autocast.

I am trying the same thing. Were you able to do it?

Nope, I have made no progress on this.

I'm fairly new to PyTorch, and loading a Faster R-CNN model is failing for me as well. I tried following the directions here: https://pytorch.org/tutorials/advanced/cpp_export.html

An exception is thrown here:

module = torch::jit::load("Path_to_model");

Has anyone figured this out?
