Print gradient on the parameters

What is the solution to print out the gradient on the parameters, in the newest version of PyTorch?

I did:

for p in model.parameters():
    print(p.grad.norm())

It failed because p.grad is None for the parameters.

The loop should print the gradients, if they have already been calculated.
Make sure to call backward() before running this code.
Also, if some parameters were unused during the forward pass, their gradients will stay None.
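Here is a minimal sketch of the full flow, using a hypothetical nn.Linear model and random data just for illustration: run a forward pass, compute a loss, call backward(), and only then read p.grad (checking for None in case a parameter did not receive a gradient):

import torch
import torch.nn as nn

# Hypothetical model and inputs, only to demonstrate the pattern.
model = nn.Linear(10, 1)
x = torch.randn(4, 10)
target = torch.randn(4, 1)

loss = nn.functional.mse_loss(model(x), target)
loss.backward()  # gradients are populated only after this call

for name, p in model.named_parameters():
    if p.grad is None:
        # Parameter was unused in the forward pass, or backward() was not called.
        print(name, "has no gradient")
    else:
        print(name, p.grad.norm())

Using named_parameters() instead of parameters() also tells you which parameter each gradient norm belongs to.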
