Access SSD Layers for Grad Cam Visualization

Hi. I am trying to visualize the Class Activation Maps for my trained object detection model using Grad-CAM. The model is an SSD model provided on torch hub here, which I further fine-tune on my custom dataset.
I need help with two questions:

  1. Which layer is the correct one to hook onto to get the gradients and visualize the activation maps? I am currently looking at the last ReLU activation in the feature_extractor:
Bottleneck(
  (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
  (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
  (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
  (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  (relu): ReLU(inplace=True)
)
  2. I am trying to understand how to access that layer. I have tried to access the children of the model using something like this:
modules = list(model.feature_extractor.children())
modules[0][6][5]

but I guess this is not the correct way to grab this layer's outputs.
Can someone please guide me on how to proceed?
Thanks 🙂

Hi @anujd9

  1. Assuming your SSD model is instantiated as model, you should be able to access the last ReLU layer of its feature_extractor via model.feature_extractor[-1][-1].relu
    By registering a backward hook on it (i.e., model.feature_extractor[-1][-1].relu.register_backward_hook(your_backward_hook_func)), you can get the gradients at that layer.

  2. To get the output from the last ReLU layer of its feature_extractor, you can use ForwardHookManager in torchdistill. Here is an example notebook to demonstrate the feature.
    In Section 3, you can use feature_extractor.-1.-1.relu as the module path for torchdistill, e.g., forward_hook_manager.add_hook(model, 'feature_extractor.-1.-1.relu', requires_input=True, requires_output=True)
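To make the hook mechanics above concrete, here is a minimal, self-contained sketch of the Grad-CAM pattern. A tiny conv-plus-ReLU stack stands in for the SSD feature_extractor (the shapes and layer names here are illustrative assumptions, not the actual torch hub model), and register_full_backward_hook is used as the non-deprecated counterpart of register_backward_hook:

```python
import torch
import torch.nn as nn

features = {}

def forward_hook(module, inputs, output):
    # Save the activation map produced by the hooked layer.
    features["activation"] = output.detach()

def backward_hook(module, grad_input, grad_output):
    # grad_output[0] is the gradient of the loss w.r.t. the layer's output.
    features["gradient"] = grad_output[0].detach()

# Stand-in for model.feature_extractor, ending in a ReLU
# (inplace=False so the full backward hook sees an unmodified output).
feature_extractor = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),
    nn.ReLU(inplace=False),
)

# Analogous to model.feature_extractor[-1][-1].relu in the SSD case.
target_layer = feature_extractor[-1]
target_layer.register_forward_hook(forward_hook)
target_layer.register_full_backward_hook(backward_hook)

x = torch.randn(1, 3, 16, 16)
out = feature_extractor(x)
out.sum().backward()  # stand-in for backpropagating a class score

# Grad-CAM: weight each channel by its spatially averaged gradient,
# combine the channels, and clamp to non-negative values.
weights = features["gradient"].mean(dim=(2, 3), keepdim=True)  # (1, 8, 1, 1)
cam = torch.relu((weights * features["activation"]).sum(dim=1))  # (1, 16, 16)
print(cam.shape)
```

On the real model you would replace feature_extractor and target_layer with your SSD instance and model.feature_extractor[-1][-1].relu, and backpropagate the detection score of the class you want to visualize instead of out.sum().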
