A "NoneType" problem about grad

Hey guys, I'm a beginner in machine learning and PyTorch. I'm following a Chinese tutorial video but ran into a problem I can't solve. I followed the teacher and built a simple neural network in a naive way, but when I train it, the grad becomes NoneType and I don't know what to do. Here is my code:

import torch

n, input, h, output = 64, 1000, 100, 10

x = torch.randn(n, input)
y = torch.randn(n, output)

w1 = torch.randn(input, h, requires_grad=True)
w2 = torch.randn(h, output, requires_grad=True)

learningrate = 0.000001

for t in range(500):
    # forward
    h = x.mm(w1)
    h_relu = h.clamp(min=0)
    y_pred = h_relu.mm(w2)

    # loss
    loss = (y_pred - y).pow(2).sum()
    print(t, loss.item())

    # backward
    loss.backward()

    with torch.no_grad():
        w1 = w1 - learningrate * w1.grad.data
        w2 = w2 - learningrate * w2.grad.data
        w1.grad.zero_()
        w2.grad.zero_()

AttributeError                            Traceback (most recent call last)
<ipython-input> in <module>
     35         w1 = w1 - 0.00001 * w1.grad.data
     36         w2 = w2 - 0.00001 * w2.grad.data
---> 37         w1.grad.zero_()
     38         w2.grad.zero_()
     39

AttributeError: 'NoneType' object has no attribute 'zero_'

I would be very grateful if someone could give me a hand and show me how to deal with this problem.

You are overwriting your parameters in the update block: w1 = w1 - learningrate * w1.grad.data binds the name w1 to a brand-new tensor produced by the subtraction. That new tensor is not a leaf tensor with accumulated gradients, so its .grad attribute is None, and the following w1.grad.zero_() raises the AttributeError.
To avoid this, you could use inplace operations, which update the existing tensors instead of replacing them:

with torch.no_grad():
    w1.sub_(learningrate*w1.grad.data)
    w2.sub_(learningrate*w2.grad.data)
    w1.grad.zero_()
    w2.grad.zero_()
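For reference, here is a minimal sketch of the whole training loop with the in-place update applied. It assumes the same shapes and learning rate as the original post; the hidden size is renamed h_dim so it doesn't clash with the activation h, and input is renamed d_in to avoid shadowing the Python builtin.

```python
import torch

torch.manual_seed(0)

# same sizes as in the original post
n, d_in, h_dim, d_out = 64, 1000, 100, 10

x = torch.randn(n, d_in)
y = torch.randn(n, d_out)

w1 = torch.randn(d_in, h_dim, requires_grad=True)
w2 = torch.randn(h_dim, d_out, requires_grad=True)

learningrate = 0.000001
first_loss = None

for t in range(500):
    # forward pass
    h = x.mm(w1)
    h_relu = h.clamp(min=0)
    y_pred = h_relu.mm(w2)

    # scalar loss
    loss = (y_pred - y).pow(2).sum()
    if first_loss is None:
        first_loss = loss.item()

    # backward pass populates w1.grad / w2.grad
    loss.backward()

    # update the existing leaf tensors in place, so w1 and w2
    # keep their .grad attributes across iterations
    with torch.no_grad():
        w1.sub_(learningrate * w1.grad)
        w2.sub_(learningrate * w2.grad)
        w1.grad.zero_()
        w2.grad.zero_()
```

Because w1 and w2 are never rebound, they stay the same leaf tensors the whole time, w1.grad is always a real tensor after the first backward() call, and the loss decreases over the 500 iterations.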

Thanks, it really helps a lot. Finally I can finish my homework! :blush: