Weight doesn't update even though weight.grad is not None

Hi, I wrote a simple script to track how an nn.Linear's parameters get updated.

import torch
import torch.nn as nn

# 'net' is a different module defined earlier in my script
optimizer = torch.optim.Adam(net.parameters(), lr=0.001)

qwe = nn.Linear(1, 1)

xx = torch.ones(1, 1, requires_grad=True)
yy = qwe(xx)
zz = yy ** 2

# compute gradients and take one optimizer step
zz.backward()
optimizer.step()

print('xx requires grad: ', xx.requires_grad)
print('yy requires grad : ', yy.requires_grad)
print("qwe's weight:", qwe.weight.clone())
print("qwe's grad", qwe.weight.grad)
print('zz requires grad :', zz.requires_grad)
print("qwe's weight:", qwe.weight.clone())

The result is

xx requires grad:  True
yy requires grad :  True
qwe's weight: tensor([[0.3074]], grad_fn=<CloneBackward>)
qwe's grad tensor([[1.7211]])
zz requires grad : True
qwe's weight: tensor([[0.3074]], grad_fn=<CloneBackward>)

You can see that qwe's grad is not None, but qwe's weight did not change.
How can I make the weight update?
Is it possible to observe the weight being updated?


I think you should give qwe.parameters() to the optimizer, not net.parameters().
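
One quick way to sanity-check this is to test whether the optimizer actually holds qwe's weight tensor, using optimizer.param_groups:

# prints True only if the optimizer was built from qwe's parameters;
# if it prints False, optimizer.step() will never touch qwe.weight
print(any(p is qwe.weight
          for group in optimizer.param_groups
          for p in group['params']))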

You are right…
and I should have put

qwe = nn.Linear(1,1)

before the optimizer line.
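
So the fixed version should look something like this (same toy example, with a snapshot of the weight taken before the step so the update is visible):

import torch
import torch.nn as nn

qwe = nn.Linear(1, 1)
optimizer = torch.optim.Adam(qwe.parameters(), lr=0.001)

xx = torch.ones(1, 1, requires_grad=True)
zz = qwe(xx) ** 2

before = qwe.weight.clone()  # snapshot so we can compare after the step

zz.backward()
optimizer.step()

print('weight before step:', before)
print('weight after step :', qwe.weight)

With Adam, the first step moves a weight by roughly lr (here 0.001, assuming a nonzero gradient), so the two printed values should differ in the third decimal place.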

Thanks a lot for catching my mistake!

Hi albanD, I have a question about shared memory: can we disable shared-memory usage in PyTorch? Thank you very much.

Could you please open a new topic for unrelated questions?

Hi, I am quite new to the PyTorch forum, so how can I open a new topic for my question? I started a question two days ago but found no reply. Can my problem not be seen by you? I will start a new topic.
The topic link is shown here:

I have opened a new topic for the problem. Thanks very much!