I’m trying to implement guided backpropagation from https://arxiv.org/abs/1412.6806, which alters the gradient computation in ReLU layers.
When a model contains nn.ReLU() instances as submodules, I can easily register a backward hook on each of its submodules, as below:
```python
import torch.nn as nn
from torchvision.models import resnet101

model = resnet101(pretrained=True)

def hook(module, grad_input, grad_output):
    if isinstance(module, nn.ReLU):
        # do something with grad_input here
        return changed_grad_input

for name, module in model.named_modules():
    module.register_backward_hook(hook)
```
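Concretely, the hook body I have in mind is something like the following sketch of the guided-backpropagation rule (the usual ReLU backward already masks by the forward input; on top of that, only positive incoming gradients are let through):

```python
import torch
import torch.nn as nn

def guided_relu_hook(module, grad_input, grad_output):
    # Sketch only: the ReLU backward has already zeroed gradients where the
    # forward input was negative; additionally clamp away negative gradients
    # coming from above, which is the guided-backprop modification.
    if isinstance(module, nn.ReLU):
        return (torch.clamp(grad_input[0], min=0.0),)
```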
However, if the model applies ReLU in functional form inside its forward method, as below,
```python
import torch.nn as nn
import torch.nn.functional as F

class foo(nn.Module):
    def forward(self, x):
        x = poo(x)       # some preceding operation (placeholder)
        x = F.relu(x)    # ReLU applied functionally, not as an nn.ReLU submodule
        return x
```
it seems the approach above does not register a hook for that ReLU.
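To illustrate, iterating over the submodules of foo turns up no nn.ReLU instance at all, so the registration loop above has nothing to attach the hook to (assuming poo is just a placeholder for some other operation):

```python
net = foo()
print([type(m) for _, m in net.named_modules()])
# only foo itself is listed; there is no nn.ReLU entry to hook
```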
How can I handle hooks in this kind of situation?