Hi, I am currently trying to differentiate a function that contains Legendre functions, which I define recursively in a matrix. I wrote a small example that computes a geometric series and reproduces the error:

```python
import torch as t

def powers_of_x(N, X):
    Y = t.ones((N,))
    for I in range(1, N):
        Y[I] = Y[I - 1] * X  # in-place write into Y
    return Y

X = t.tensor([2.0], requires_grad=True)
SERIES = powers_of_x(9, X).sum()
SERIES.backward()
X.grad
```

I guess torch.autograd doesn’t like the fact that I’m overwriting entries of the ones vector in place, i.e. mutating a tensor it needs for the backward pass. How do I get around this? I would greatly appreciate any help.
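For what it's worth, one workaround I have seen suggested is to avoid the in-place writes entirely: collect each power in a Python list and combine them with `torch.cat` (or `torch.stack`) at the end, so autograd never sees a tensor being mutated after it was used. A minimal sketch of that idea, adapted from the code above:

```python
import torch as t

def powers_of_x(N, X):
    # Build each power out-of-place; appending to a list does not
    # mutate any tensor that autograd has already saved.
    Y = [t.ones_like(X)]
    for I in range(1, N):
        Y.append(Y[I - 1] * X)
    return t.cat(Y)  # X has shape (1,), so cat gives shape (N,)

X = t.tensor([2.0], requires_grad=True)
SERIES = powers_of_x(9, X).sum()  # 1 + 2 + 4 + ... + 256 = 511
SERIES.backward()
print(X.grad)  # d/dx sum_{k=0}^{8} x^k = sum_{k=1}^{8} k*x^(k-1) = 1793 at x = 2
```

The same pattern should carry over to the Legendre recursion: build the rows as a list and stack them once the loop is done.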