In your example, it just prints a 5x5 Tensor filled with ones. If I instead use a working example (for instance, the CIFAR-10 tutorial: https://github.com/pytorch/tutorials/blob/master/Deep%20Learning%20with%20PyTorch.ipynb), run a single iteration, and write:
loss.register_hook(lambda g: print(g))
I get:
Variable containing:
1
[torch.FloatTensor of size 1]
Variable containing:
1
[torch.FloatTensor of size 1]
which isn’t very helpful. Now, in my example, if I want all the gradients that are computed for my loss function, how can I use register_hook to get them?
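If I understand correctly, the hook on `loss` only ever receives d(loss)/d(loss), which is always 1 for a scalar loss, so that output is expected. Here is a minimal sketch of what I tried instead (a toy linear computation, not the CIFAR-10 net), registering hooks on the intermediate tensors whose gradients I actually want:

```python
import torch

# Toy example (hypothetical, not the tutorial's network): hooks on
# intermediate tensors fire with the gradient of the loss w.r.t. them.
x = torch.randn(4, 3)
w = torch.randn(3, 2, requires_grad=True)

y = x @ w                 # intermediate (non-leaf) tensor
loss = y.pow(2).mean()    # scalar loss

grads = {}
y.register_hook(lambda g: grads.setdefault("y", g))        # d(loss)/d(y)
loss.register_hook(lambda g: grads.setdefault("loss", g))  # d(loss)/d(loss) == 1

loss.backward()

print(grads["loss"])      # tensor(1.)
print(grads["y"].shape)   # torch.Size([4, 2])
print(w.grad.shape)       # leaf gradients land in .grad: torch.Size([3, 2])
```

So hooks on `loss` itself are only useful for scaling the incoming gradient; to inspect the gradients flowing through the network, the hooks seem to belong on the intermediate tensors (or on the parameters, whose gradients end up in `.grad` anyway).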