My code is
import torch
import torch.nn as nn
from torch.autograd import Variable

x = Variable(torch.rand(8, 1, 5, 5), requires_grad=True)
conv = nn.Conv2d(1, 1, 3)
y = conv(x)
final = torch.sum(y)
print('y.grad', y.grad)
final.backward()
print('y.grad', y.grad)
print('y.requires_grad', y.requires_grad)
However, the output is
y.grad None
y.grad None
y.requires_grad True
If y.requires_grad == True, shouldn't the second y.grad print show gradients instead of None? For x, the gradient appears as expected after backward().
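I did notice that calling y.retain_grad() before backward() makes y.grad come out non-None, so I suspect y is being treated as a non-leaf tensor whose gradient is discarded by default. A minimal sketch of what I tried (using the current tensor API instead of Variable):

```python
import torch
import torch.nn as nn

x = torch.rand(8, 1, 5, 5, requires_grad=True)
conv = nn.Conv2d(1, 1, 3)
y = conv(x)
y.retain_grad()  # ask autograd to keep the gradient of this intermediate tensor
final = torch.sum(y)
final.backward()

print(y.grad is None)  # now populated instead of None
print(x.grad is None)  # x is a leaf with requires_grad=True, so its grad is kept
```

Is retain_grad() the intended way to inspect gradients of intermediate results, or is there something else I'm missing?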