How to redefine the backward function for an nn.functional function like relu

I am trying to plot guided backpropagation for inception_v3. Basically, it needs to backpropagate only the positive gradients through relu. However, in the inception_v3 model, all the relu operations are applied inside the BasicConv2d class via nn.functional.relu. How can we redefine the backward function for nn.functional.relu? Plotting guided backpropagation for resnet is relatively easy, since we just need to call register_backward_hook on all the relu layers in resnet, as in the sketch below.
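For reference, a minimal sketch of the resnet case I mean, assuming torchvision's resnet18 and the classic register_backward_hook API (the hook name clamp_relu_grad is just illustrative):

```python
import torch
import torch.nn as nn
from torchvision import models

def clamp_relu_grad(module, grad_input, grad_output):
    # ReLU's own backward has already zeroed gradients where the forward input
    # was negative; additionally keep only the positive gradients (guided backprop).
    return (torch.clamp(grad_input[0], min=0.0),)

model = models.resnet18(pretrained=True)
for m in model.modules():
    if isinstance(m, nn.ReLU):
        m.register_backward_hook(clamp_relu_grad)
```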

Switch to using the module version of relu, or add a backward hook on the output tensors of the relus, for example as sketched below.
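A rough sketch of the second suggestion, assuming torchvision's inception_v3 where BasicConv2d.forward ends with F.relu, so the module's output is exactly the relu output; the function names here are only for illustration:

```python
import torch
from torchvision import models
from torchvision.models.inception import BasicConv2d

def clamp_grad(grad):
    # Gradient w.r.t. the relu output: keep only its positive part; autograd's
    # relu backward then zeroes it wherever the forward input was negative.
    return torch.clamp(grad, min=0.0)

def attach_guided_hook(module, inputs, output):
    # BasicConv2d.forward ends with F.relu, so `output` is the relu output tensor.
    output.register_hook(clamp_grad)

model = models.inception_v3(pretrained=True)
for m in model.modules():
    if isinstance(m, BasicConv2d):
        m.register_forward_hook(attach_guided_hook)
```

With these hooks in place, running a forward pass on an input that has requires_grad=True and calling backward from the target class score should leave the guided-backprop gradient in the input's .grad.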

Thanks. That seems to be the only option.