How to compute the gradients of non-leaf variables in PyTorch

import torch
from torch.autograd import Variable

x = Variable(torch.ones(2, 2), requires_grad=True)
y = x + 2
z = y * y * 3
out = z.mean()
out.backward()

In this example, only the gradient of x is available after backward(). How can I obtain the gradient of y?

You should call register_hook on y before calling backward.
For example:

def hook_y(grad):
    # grad is d(out)/d(y), delivered during the backward pass
    print(grad)

x = Variable(torch.ones(2, 2), requires_grad=True)
y = x + 2
z = y * y * 3

y.register_hook(hook_y)  # attach the hook before calling backward()

out = z.mean()
out.backward()
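
If you want to keep the gradient around instead of just printing it, the hook can store it in a dict. Here save_grad and the grads dict are just illustrative names, not part of the PyTorch API:

grads = {}

def save_grad(name):
    def hook(grad):
        grads[name] = grad  # stash the incoming gradient under the given name
    return hook

x = Variable(torch.ones(2, 2), requires_grad=True)
y = x + 2
z = y * y * 3
y.register_hook(save_grad('y'))

out = z.mean()
out.backward()
print(grads['y'])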

If you use the master-branch version of PyTorch (which must be built from source yourself),
I think the torch.autograd.grad function may be a better choice.
But I haven't tried it.
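
If that function is available in your build, a minimal sketch would look like this (untested, per the caveat above):

import torch
from torch.autograd import Variable

x = Variable(torch.ones(2, 2), requires_grad=True)
y = x + 2
z = y * y * 3
out = z.mean()

# ask autograd for d(out)/d(y) directly, without calling backward()
grad_y, = torch.autograd.grad(out, y)
print(grad_y)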


Thank you very much!

The following also works, using retain_grad():

x = Variable(torch.ones(2, 2), requires_grad=True)
y = x + 2
y.retain_grad()  # ask autograd to keep the gradient of this non-leaf variable
z = y * y * 3
out = z.mean()
out.backward()
print(y.grad)  # prints a 2x2 tensor filled with 4.5
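
Note that retain_grad() stores the gradient in y.grad at the cost of keeping an extra tensor around, while a hook lets you inspect the gradient during the backward pass without retaining it.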