Hi, this is a beginner question that can probably be answered quickly. But I could not find the answer online, probably because I do not know the proper vocabulary to search for.

I am trying to save the gradients into the variable `grads`, as in the code below.

```
grads = [0, 0]
with torch.no_grad():
    for idx, p in enumerate(model.parameters()):
        # save current gradients
        grads[idx] = p.grad
        p.sub_(lr * p.grad)
        p.grad.zero_()
```

However, `p.grad.zero_()` sets not only `p.grad` but also `grads` to zero.

I understand that this has to do with the way these two variables are connected through a graph. But how do I stop `p.grad.zero_()` from also setting `grads` to zero?

I tried using `detach`, but that did not work:

```
grads = [0, 0]
with torch.no_grad():
    for idx, p in enumerate(model.parameters()):
        # save current gradients
        grads[idx] = p.grad.detach()
        p.sub_(lr * p.grad)
        p.grad.zero_()
```
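To illustrate, here is a minimal standalone example of the behaviour I am seeing, without a model (I assume the same thing happens with any in-place op on the gradient tensor). Adding a `clone()` seems to avoid it, but I am not sure whether that is the intended way to do this:

```python
import torch

# A stand-in for p.grad: just a plain tensor.
g = torch.tensor([1.0, 2.0])

saved_detach = g.detach()          # detach: appears to share storage with g
saved_clone = g.detach().clone()   # clone: appears to make an independent copy

g.zero_()  # stand-in for p.grad.zero_()

print(saved_detach)  # zeroed along with g
print(saved_clone)   # keeps the original values
```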