zero_grad() gives None gradient

I’m trying to implement this tutorial

When I arrive at this point and print the gradient, it prints None. I couldn't find a similar problem anywhere on the internet that explains why.

net.zero_grad() # zeroes the gradient buffers of all parameters

print('conv1.bias.grad before backward')
print(net.conv1.bias.grad)

But after calling backward, the results are good:

print('conv1.bias.grad after backward')
print(net.conv1.bias.grad)

The .grad field is None by default, and None is treated the same as a gradient full of zeros.
So until you actually compute a gradient for this Variable, the field will remain None.
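This is easy to verify with a minimal sketch. It uses a plain nn.Linear as a stand-in for the tutorial's net (not the tutorial's exact model). Note that in recent PyTorch versions zero_grad() defaults to set_to_none=True, so we pass set_to_none=False to actually get a tensor of zeros back:

```python
import torch
import torch.nn as nn

net = nn.Linear(4, 2)  # stand-in model, assumed for illustration

# Before any backward pass, .grad is None; zero_grad() does not change that.
net.zero_grad()
print(net.bias.grad)  # None

# After computing a loss and calling backward, .grad holds real values.
out = net(torch.randn(3, 4))
loss = out.sum()
loss.backward()
print(net.bias.grad)  # a gradient tensor

# Zeroing with set_to_none=False keeps the tensor and fills it with zeros.
net.zero_grad(set_to_none=False)
print(net.bias.grad)  # tensor of zeros, not None
```

So the tutorial's printed zeros only appear once a gradient tensor exists; before the first backward pass there is simply nothing to zero.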


Thanks for clearing this up. It's just that the tutorial said I should get zeros when I zero the gradients.