How to register a hook function for the functional form?

I’m currently trying to implement guided backpropagation from https://arxiv.org/abs/1412.6806, which alters the gradient calculation in the ReLU layers.

When the given model contains nn.ReLU() instances as submodules, I can easily register a backward hook on every submodule as below.

import torch.nn as nn
from torchvision.models import resnet101

model = resnet101(pretrained=True)

def hook(module, grad_input, grad_output):
    if isinstance(module, nn.ReLU):
        # e.g. for guided backprop: keep only the positive part of the incoming gradient
        changed_grad_input = tuple(g.clamp(min=0) if g is not None else None for g in grad_input)
        return changed_grad_input

for name, module in model.named_modules():
    module.register_backward_hook(hook)

However, if the model applies ReLU in functional form inside its forward method, as below,

import torch.nn as nn
import torch.nn.functional as F

class foo(nn.Module):
    def forward(self, x):
        x = poo(x)        # some other operation
        x = F.relu(x)     # functional ReLU, not an nn.ReLU submodule
        return x

it seems the above approach doesn’t register a hook for that call.

How can I deal with hook functions in this kind of situation?

Hi,

Unfortunately, I’m not sure there is an easy way to do this.
I guess you could always replace F.relu with a custom function that routes through the nn.Module version with your hook attached, as sketched below.
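
Here is a minimal sketch of that first option, assuming you are willing to monkey-patch torch.nn.functional.relu for the whole process; the names HookedReLU, guided_relu_hook and relu_with_hook, and the clamp-to-positive rule from the guided-backprop paper, are my own illustration, not an existing API:

import torch
import torch.nn as nn
import torch.nn.functional as F

_original_relu = F.relu  # keep a reference to the real functional relu

class HookedReLU(nn.ReLU):
    # calls the saved original so the patched F.relu below cannot recurse into itself
    def forward(self, input):
        return _original_relu(input, inplace=False)

def guided_relu_hook(module, grad_input, grad_output):
    # guided backprop: only let positive gradients flow back through the ReLU
    return tuple(g.clamp(min=0) if g is not None else None for g in grad_input)

_hooked_relu = HookedReLU()
_hooked_relu.register_backward_hook(guided_relu_hook)

def relu_with_hook(input, inplace=False):
    # ignore `inplace` and route every functional call through the hooked module
    return _hooked_relu(input)

F.relu = relu_with_hook  # every F.relu call in forward() now triggers the hook

Note that this changes F.relu for everything in the process (including plain nn.ReLU modules, which call F.relu internally), which is usually what you want for guided backprop, but you should restore F.relu = _original_relu once you are done.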
Otherwise, you can add a hook directly on any tensor you want (Tensor.register_hook), but that requires modifying the forward method of your module; see the second sketch below.
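
A sketch of that second option, again only as an illustration (the clamp is the guided-backprop rule, and poo stands in for whatever the original forward does before the ReLU, as in your snippet):

import torch
import torch.nn as nn
import torch.nn.functional as F

class foo(nn.Module):
    def forward(self, x):
        x = poo(x)       # rest of the original forward (from the question)
        x = F.relu(x)
        if x.requires_grad:
            # the hook sees the gradient flowing back into this tensor
            # and keeps only its positive part (guided backprop)
            x.register_hook(lambda grad: grad.clamp(min=0))
        return x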
