Backwards hook dimensions unexpected

Consider the following code:

import torch
import torch.nn as nn

def hook(module, gradIn, gradOut):
    print(gradOut)
    print(gradIn)

test_model = nn.Linear(3, 1)
test_model.register_backward_hook(hook)
test_inp = torch.randn(3, requires_grad=True)
out = test_model(test_inp)
out.sum().backward()  # triggers the backward hook
print(out)
print(test_inp)

The output is:

(tensor([1.]),)
(tensor([1.]), tensor([1.]))
tensor([0.7014], grad_fn=<AddBackward0>)
tensor([-0.0393, -1.4974, 1.3676], requires_grad=True)

Why does gradIn (the second line of output) have 2 elements when the layer has 3 inputs?

This is a known issue: https://github.com/pytorch/pytorch/issues/12331. register_backward_hook actually attaches to the last autograd operation inside the module, not to the module as a whole. Here that last operation is the bias addition, so gradIn contains the gradients for that Add node's two inputs (the matmul result and the bias, both shape (1,)), not the gradient with respect to your 3-element input.
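
If you are on PyTorch 1.8 or later, register_full_backward_hook addresses this: its grad_input corresponds to the module's actual inputs. A minimal sketch along the same lines as your example:

import torch
import torch.nn as nn

def full_hook(module, grad_in, grad_out):
    # grad_in now matches the module's real inputs:
    # a single tensor of shape (3,) for this Linear layer
    print(grad_out)
    print(grad_in)

test_model = nn.Linear(3, 1)
test_model.register_full_backward_hook(full_hook)
test_inp = torch.randn(3, requires_grad=True)
test_model(test_inp).sum().backward()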

Alternatively, try using register_hook on each input tensor independently:

import torch
import torch.nn as nn

def hook(grad):
    # grad is the gradient of the loss with respect to test_inp
    print(grad)

test_model = nn.Linear(3, 1)
test_inp = torch.randn(3, requires_grad=True)
test_inp.register_hook(hook)  # tensor-level hook: fires with the full gradient
out = test_model(test_inp)
out.sum().backward()
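
This prints the gradient of the loss with respect to test_inp directly: a single tensor of shape (3,), which for this sum-of-outputs loss equals the layer's weight row.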