with torch.no_grad():
    w -= 0.01 * w.grad
    w.grad.zero_()
Everything works with no error.
However, if I instead write:
with torch.no_grad():
    w = w - 0.01 * w.grad
    w.grad.zero_()
I get the following error:
    Traceback (most recent call last):
      File "/home/jihao/deep_learning/auto_gradient.py", line 34, in <module>
        w.grad.zero_()
    AttributeError: 'NoneType' object has no attribute 'zero_'
What is the difference between the two snippets of code?
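For reference, here is a minimal standalone reproduction of the failing case (the tensor setup is my guess, since the full script isn't shown above):

```python
import torch

# Assumed setup: a scalar parameter with a computed gradient.
w = torch.tensor(2.0, requires_grad=True)
loss = w * w
loss.backward()          # now w.grad is populated

with torch.no_grad():
    w = w - 0.01 * w.grad   # rebinds the name `w` to a brand-new tensor

# The new tensor was created under no_grad, so it is detached
# from the graph and has no .grad attribute value:
print(w.requires_grad)   # False
print(w.grad)            # None -> w.grad.zero_() raises AttributeError
```

With the in-place version (`w -= ...`) the name `w` keeps pointing at the original leaf tensor, whose `.grad` still exists.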