I am looking for a way to add new conv2d and relu layers between the backbone and the RPN of Faster R-CNN. I have already created a custom nn.Module for the new layers. I also discovered forward hooks, which are called after each forward pass. My idea is to grab the output activations of the last layer of the ResNet backbone with a hook, feed them through my new module, and pass the resulting activations on as the input to the RPN.
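To make the idea concrete, here is a minimal sketch of what I have in mind, assuming torchvision's fasterrcnn_resnet50_fpn (the model choice, the `insert_extra_layers` hook name, and the channel counts are just for illustration):

```python
import torch
from torch import nn
from torchvision.models.detection import fasterrcnn_resnet50_fpn

model = fasterrcnn_resnet50_fpn(weights="DEFAULT")

# My new layers; the channel count matches the backbone's FPN output
# (model.backbone.out_channels, 256 for this backbone).
extra = nn.Sequential(
    nn.Conv2d(model.backbone.out_channels, model.backbone.out_channels,
              kernel_size=3, padding=1),
    nn.ReLU(inplace=True),
)

def insert_extra_layers(module, inputs, output):
    # If a forward hook returns a value, that value replaces the module's
    # output. The FPN backbone returns an OrderedDict of feature maps,
    # so the new layers are applied to each map before it reaches the RPN.
    return {name: extra(fmap) for name, fmap in output.items()}

handle = model.backbone.register_forward_hook(insert_extra_layers)
```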
Is this a valid way to modify this kind of network?
If it is, will my new nn.Module receive gradient updates as usual during training?
Thanks in advance for any help.