Error when re-defining Faster R-CNN module

Hello everyone,

I’m facing a problem when trying to re-define Faster R-CNN.

I copied all of the code from the torchvision implementation of Faster R-CNN, adjusting the import paths as needed (e.g., prepending `torchvision.models.detection` to the modules that require it).

The strange thing is: if I do not re-define `class FasterRCNN(GeneralizedRCNN)`, everything works and no error arises. But as soon as I add that class (copied and pasted from the link, with no modifications), I get an error:

/usr/local/lib/python3.6/dist-packages/torchvision/ops/ in setup_scales(self, features, image_shapes)
    109         # get the levels in the feature map by leveraging the fact that the network always
    110         # downsamples by a factor of 2 at each level.
--> 111         lvl_min = -torch.log2(torch.tensor(scales[0], dtype=torch.float32)).item()
    112         lvl_max = -torch.log2(torch.tensor(scales[-1], dtype=torch.float32)).item()
    113         self.scales = scales

IndexError: list index out of range

Any suggestions?

My current version of torchvision is 0.4.2, which should be the latest stable release.

Thank you.

Hello Luppo,

I had the same problem. It worked as soon as I used strings in `featmap_names`, i.e. `featmap_names=['0']`:

        # output_size and sampling_ratio as in the torchvision detection tutorial
        roi_pooler = torchvision.ops.MultiScaleRoIAlign(featmap_names=['0'],
                                                        output_size=7,
                                                        sampling_ratio=2)

Maybe it can help you.


Thank you for your response!
Actually, I solved the problem by inspecting the code in my installed library and diffing it against the version on GitHub. I found that `box_roi_pool` was defined as

box_roi_pool = MultiScaleRoIAlign(
                featmap_names=[0, 1, 2, 3],

In the Faster R-CNN code on GitHub, instead, it is `featmap_names=['0', '1', '2', '3']`. I really don’t know why this change was made, but switching to the string names solved my problem.

I hope this helps someone else.


Thanks for the solution, it worked for me.
I believe this PyTorch tutorial (which I was following) should be updated to account for `featmap_names` being `list[str]`, as defined in its documentation.

Thank you! This also helped me a lot.