Zero_grad() gives None gradient

I’m trying to implement this tutorial http://pytorch.org/tutorials/beginner/blitz/neural_networks_tutorial.html

When I reach this point and print the gradient, it gives me None. I couldn't find a similar problem on the internet that explains why.
net.zero_grad() # zeroes the gradient buffers of all parameters

print('conv1.bias.grad before backward')
print(net.conv1.bias.grad)

But after calling backward, the results look right:
loss.backward()

print('conv1.bias.grad after backward')
print(net.conv1.bias.grad)

The .grad field is None by default, and a value of None is equivalent to a gradient full of zeros.
So until you actually compute a gradient for this Variable, this field will remain None. zero_grad() only resets gradient buffers that already exist; it does not create them.
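You can see the difference with a small standalone tensor (a minimal sketch, not tied to the tutorial's network; the parameter name w is just for illustration). Before any backward call, .grad is None; only after a backward pass does a real tensor exist that can then be zeroed:

```python
import torch

# A single parameter that requires gradients.
w = torch.ones(3, requires_grad=True)

# Before any backward pass, .grad is None (not a zero tensor).
print(w.grad)  # None

# Compute a gradient: d(sum(w*w))/dw = 2*w.
(w * w).sum().backward()
print(w.grad)  # tensor([2., 2., 2.])

# Zeroing afterwards gives an actual tensor of zeros, not None.
w.grad.zero_()
print(w.grad)  # tensor([0., 0., 0.])
```

Note that in recent PyTorch versions, zero_grad() may set gradients back to None rather than to zero tensors (the set_to_none behavior), so seeing None after zeroing is also expected there.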

Thanks for clearing this up. It's just that the tutorial made it sound like I should see zeros after zeroing the gradients.