My goal is to run Fast R-CNN inference with my own proposals. Unfortunately, there is no implementation of this architecture in PyTorch, so I decided to implement it myself based on the FasterRCNN implementation.
I already tried to build a custom FastRCNN architecture based on FasterRCNN (torchvision.models.detection.faster_rcnn) and GeneralizedRCNN (torchvision.models.detection.generalized_rcnn) by removing the RPN parts and adding a proposals parameter to the GeneralizedRCNN forward method.
I successfully loaded a ResNet-50 backbone, but I didn't find a way to load a pre-trained FastRCNNPredictor, which is wrapped inside the pre-trained model "fasterrcnn_resnet50_fpn_coco".
Is it possible to extract the FastRCNNPredictor from a pre-trained model and add it to my architecture?
You should be able to directly access the predictor via:

print(model.roi_heads.box_predictor.state_dict().keys())
# > odict_keys(['cls_score.weight', 'cls_score.bias', 'bbox_pred.weight', 'bbox_pred.bias'])

and could thus try to load this state_dict into your new model's submodule.
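A minimal sketch of that transfer, without downloading the pretrained weights: as the state_dict keys above suggest, the predictor is essentially two linear heads, so the stand-in BoxPredictor class below (a hypothetical stand-in, not the torchvision class) mirrors its layout. In practice, source would be model.roi_heads.box_predictor from the pretrained model.

```python
import torch
import torch.nn as nn

class BoxPredictor(nn.Module):
    """Stand-in mirroring FastRCNNPredictor's state_dict layout:
    'cls_score.*' and 'bbox_pred.*' keys, as printed above."""
    def __init__(self, in_channels=1024, num_classes=91):
        super().__init__()
        self.cls_score = nn.Linear(in_channels, num_classes)
        self.bbox_pred = nn.Linear(in_channels, num_classes * 4)

    def forward(self, x):
        return self.cls_score(x), self.bbox_pred(x)

source = BoxPredictor()  # stands in for model.roi_heads.box_predictor
target = BoxPredictor()  # the predictor inside the custom FastRCNN

# Copy the pretrained weights; keys must match exactly (strict=True default).
target.load_state_dict(source.state_dict())

# Verify the transfer worked.
assert torch.equal(target.cls_score.weight, source.cls_score.weight)
```

Note that load_state_dict will raise if the shapes differ, so the custom predictor must be built with the same in_channels and num_classes as the COCO-pretrained one (1024 and 91 for fasterrcnn_resnet50_fpn).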
Thank you for your response.
I managed to load the backbone and box_predictor successfully.
I have a general question. My final goal is to export the model to ONNX format in order to use it in C++ with ONNX Runtime. As I said, I based my custom FastRCNN architecture on FasterRCNN. When I export the model to ONNX, should I expect the post-processing parts (like the box regression applied to the input proposals, or the box decoding as done in the FasterRCNN architecture) to be exported as well? Or do I need to export only the "processing parts" up to the box_predictor outputs and re-implement the post-processing operations in C++?
I'm not familiar enough with the ONNX export process for FasterRCNN, but I would assume that the "entire" model is exported as it's passed to torch.onnx.export, i.e. if the post-processing steps are part of the model's forward pass, I would assume they are also in the exported model.
One thing you could check is whether any methods are decorated with torch.jit.ignore, which wouldn't be scripted.
Ok I will take a look.
Thank you for your quick responses and for your work!