The code below is run under PyTorch v0.4.0.
import torch

a = torch.Tensor(3, 3).requires_grad_()
with torch.no_grad():
    b = a[1]
print(b.requires_grad)
with torch.enable_grad():
    c = b * 2
print(c.requires_grad)
c.sum().backward()
print(a.grad)
The output is:
True
True
None
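Checking the graph attributes directly makes the structure clearer (a minimal sketch under the same v0.4.0 setup; is_leaf and grad_fn are the standard tensor attributes). If the behavior reported above holds, b.is_leaf should be True and b.grad_fn should be None, i.e. there is no edge linking b back to a:

import torch

a = torch.Tensor(3, 3).requires_grad_()
with torch.no_grad():
    # Slicing under no_grad(): per the output above, b still requires grad.
    b = a[1]

# Assuming the reported v0.4.0 behavior, b is a brand-new leaf:
# is_leaf should be True and grad_fn should be None,
# so nothing connects b back to a in the autograd graph.
print(b.is_leaf)
print(b.grad_fn)
print(b.requires_grad)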
So, since a is not involved in the backward pass (namely, it is not in the autograd computation graph), why is b set to require grad as a leaf variable? This could easily mislead users into believing that a is also in the autograd computation graph if they don't check a.grad.
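To illustrate where the gradient actually goes, here is a minimal sketch under the same setup (assuming the v0.4.0 behavior reported above, where b still requires grad): on that assumption the gradient from backward() should accumulate on the leaf b, while a.grad stays None.

import torch

a = torch.Tensor(3, 3).requires_grad_()
with torch.no_grad():
    b = a[1]      # reported to still require grad under v0.4.0

with torch.enable_grad():
    c = b * 2     # c is connected to b, but not to a

c.sum().backward()

# Assuming the reported behavior, the gradient stops at the leaf b:
# b.grad should be a tensor of 2s, while a.grad stays None.
print(a.grad)
print(b.grad)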