Backward pass starting from an intermediate layer

I would like to compute a backward pass that starts from a certain Variable, not the final output of the network, in a given direction (a directional derivative).

In PyTorch terms, this would correspond to
torch.autograd.backward(var, grad_variables=dir)
where var is the Variable from which I would like to start the backward pass and dir is the direction.
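For reference, a minimal self-contained sketch of that call (the tiny net, x, and direction below are made-up stand-ins; also note that newer PyTorch renamed the grad_variables keyword to grad_tensors):

    import torch
    import torch.nn as nn

    # Toy network; `net`, `x`, `direction` are hypothetical examples.
    net = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))
    x = torch.randn(2, 4, requires_grad=True)

    h = net[0](x)                    # intermediate Variable to start from
    y = net[1:](h)                   # rest of the forward pass (unused below)

    direction = torch.ones_like(h)   # the direction `dir`
    torch.autograd.backward(h, grad_tensors=direction)
    print(x.grad)                    # Jacobian of h w.r.t. x, contracted with `direction`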

My guess is that I have to attach a backward hook to the relevant Variable and have the hook return my desired direction. However, I do not know how to access the Variable inside the network that I am interested in.
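One way to get at that Variable, as a sketch (assuming a torchvision ResNet; the layer choice and names are just illustrations): register a forward hook that stashes the output you care about, run a forward pass, then start backward() from the stashed tensor:

    import torch
    import torchvision

    model = torchvision.models.resnet18()
    captured = {}

    def save_output(module, inputs, output):
        captured["act"] = output     # keep a handle to the intermediate output

    model.layer1.register_forward_hook(save_output)

    x = torch.randn(1, 3, 224, 224, requires_grad=True)
    model(x)                         # fills captured["act"]

    direction = torch.randn_like(captured["act"])
    torch.autograd.backward(captured["act"], grad_tensors=direction)

A forward hook hands you the output Variable without modifying the network; the backward-hook route also works, but it is trickier because such hooks only fire during a backward pass that has already been started from somewhere else.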

Thanks in advance for any help!

Maybe a simpler question: given the ResNet architecture, how can I gain access to all the Variables that are the outputs of, let's say, the ReLU layers?

Loop through the children of the Sequential that contains those ReLU layers. Check whether each layer is a ReLU; if so, add a forward hook to it.
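Something like this rough sketch (using torchvision's resnet18 as an assumed example; I walk modules() recursively rather than children() because ResNet's ReLUs sit inside the residual blocks):

    import torch.nn as nn
    import torchvision

    model = torchvision.models.resnet18()
    relu_outputs = []

    def save_relu(module, inputs, output):
        relu_outputs.append(output)

    # Walk the model recursively and hook every ReLU we find.
    for module in model.modules():
        if isinstance(module, nn.ReLU):
            module.register_forward_hook(save_relu)

One caveat: torchvision's BasicBlock reuses a single ReLU module for both of its activations, so that hook fires more than once per block; after one forward pass, relu_outputs holds every ReLU output in call order.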

Thanks!!! I think I got it.