When can var.grad be None?

I have a trivial script, and it says the gradient is None:

import torch
from torch.autograd import Variable

def check3():
    x = Variable(torch.FloatTensor(1).fill_(3), requires_grad=True)
    y = 2 * x        # y is an intermediate (non-leaf) node
    y = y + y
    y.backward()     # fine: y has a single element
    print(f'y.grad={y.grad}')

check3()

gives:

y.grad=None

Why?

Autograd computes gradients through the whole graph, but it only populates the .grad attribute of leaf nodes, i.e. Variables created by the user with requires_grad=True. Here x is a leaf node, so x.grad is filled in, while y is an intermediate result whose gradient is discarded after the backward pass, so y.grad stays None.
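A minimal sketch of the difference (using the newer API where Variable is merged into Tensor; if your version supports it, retain_grad() asks autograd to keep the gradient of a non-leaf node):

import torch

x = torch.full((1,), 3.0, requires_grad=True)  # leaf node
y = 2 * x
y = y + y          # y = 4x, an intermediate (non-leaf) node
y.retain_grad()    # keep y's gradient instead of discarding it
y.backward()

print(x.grad)      # tensor([4.]) -- d(4x)/dx = 4
print(y.grad)      # tensor([1.]) -- dy/dy = 1, kept due to retain_grad()

Without the retain_grad() call, the last print would again show None, which is exactly what the original script observes.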
