torch.no_grad() removes requires_grad when the weight tensor is updated by a non-compound operation

I created a simple linear regression using PyTorch (version 1.11).

My question is: isn't the statement `w -= lr * w.grad` the same as `w = w - lr * w.grad`?
Then why does my program pass in one case and fail in the other?

It fails when my code has:

```python
with torch.no_grad():
    w = w - lr * w.grad
```


But it works fine when I change the line to:

```python
with torch.no_grad():
    w -= lr * w.grad
```

```
tensor(2.0000, requires_grad=True)
```

```
In [19]: torch.__version__
Out[19]: '1.11.0'
```

I already filed a bug, which is closed now: #74158.
I think this is a valid bug, though a minor one.
Please comment. Thank you for your help.

No, it's not the same: the former applies the subtraction in-place on `w`, while the latter creates a new tensor and rebinds the name `w` to it.
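A minimal sketch of the in-place case (using a plain scalar step in place of `lr * w.grad` to keep it self-contained): the augmented assignment mutates the existing tensor's storage, so `w` stays the same leaf tensor and keeps `requires_grad=True`.

```python
import torch

w = torch.tensor(2.0, requires_grad=True)
ptr_before = w.data_ptr()

with torch.no_grad():
    w -= 0.1  # in-place: mutates the existing tensor's storage

assert w.data_ptr() == ptr_before  # same storage, same tensor object
assert w.requires_grad             # still a leaf that requires grad
```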

From the docs of `no_grad()`:

> In this mode, the result of every computation will have requires_grad=False, even when the inputs have requires_grad=True.

`w = w - lr * w.grad` creates a new `w` tensor inside the `no_grad` context, which will not require gradients, so this is expected behavior.

Thank you for clarifying.
So I'll leave it as it is.