You should call register_hook before the backward call.
e.g.
import torch
from torch.autograd import Variable

def hook_y(grad):
    # called with d(out)/d(y) during the backward pass
    print(grad)

x = Variable(torch.ones(2, 2), requires_grad=True)
y = x + 2
z = y * y * 3
y.register_hook(hook_y)  # register the hook before calling backward
out = z.mean()
out.backward()  # prints the gradient of out with respect to y
If you use the master branch version of PyTorch (you have to build it yourself from source), I think torch.autograd.grad would be a better choice, but I haven't tried it.
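If it helps, here is a minimal sketch of what that might look like, assuming a build in which torch.autograd.grad is available; it asks for the gradient of out with respect to the intermediate tensor y directly, without registering a hook.

import torch
from torch.autograd import Variable

x = Variable(torch.ones(2, 2), requires_grad=True)
y = x + 2
z = y * y * 3
out = z.mean()

# torch.autograd.grad returns a tuple with one entry per input; here d(out)/d(y)
grad_y, = torch.autograd.grad(out, y)
print(grad_y)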