I would like to compute a backward pass starting from a certain Variable, which is not the last one, in a certain direction (directional derivative).
So in PyTorch terms, this would correspond to

```python
torch.autograd.backward(var, grad_variables=dir)
```

where `var` is the variable from which I would like to start the backward pass and `dir` is the direction.
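To make the question concrete, here is a minimal sketch of what I mean by a directional backward pass from an intermediate variable (in current PyTorch the keyword is `grad_tensors`; `grad_variables` is the older name). The values here are just an illustration:

```python
import torch

# y is an intermediate result, not the final output of the graph
x = torch.ones(3, requires_grad=True)
y = x * 2          # intermediate variable we start the backward pass from
z = y.sum()        # later part of the graph, not used for the backward call

# the chosen direction for the directional derivative
direction = torch.tensor([1.0, 0.0, 0.0])

# vector-Jacobian product: direction^T @ dy/dx accumulates into x.grad
torch.autograd.backward(y, grad_tensors=direction)
print(x.grad)      # tensor([2., 0., 0.]) since dy/dx = 2 * I
```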
My guess is that I have to attach a hook to the Variable in question and have the hook return my desired direction. However, I do not know how to get access to that Variable inside the network.
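One possibility I have considered (a sketch, not a confirmed solution, and the model architecture here is made up): register a forward hook on the layer whose output I care about, stash that output during the forward pass, and then call `backward` on it directly instead of returning the direction from a hook:

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

captured = {}

def save_output(module, inputs, output):
    # keep the graph-connected output tensor of this layer
    captured["act"] = output

# hook on the first Linear layer; remove it after one forward pass
handle = model[0].register_forward_hook(save_output)

x = torch.randn(1, 4, requires_grad=True)
model(x)                      # forward pass populates captured["act"]
handle.remove()

# start the backward pass from the intermediate output, in a chosen direction
direction = torch.ones_like(captured["act"])
torch.autograd.backward(captured["act"], grad_tensors=direction)
print(x.grad.shape)           # gradient w.r.t. the input, shape (1, 4)
```

Would this be the intended way to reach an intermediate Variable, or is there a more direct mechanism?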
Thanks in advance for any help!