Requires_grad of list[0] Disappears

import torch

def f():
    x = []
    for i in range(3):
        # .clone() is redundant on a freshly created tensor, but harmless here
        x.append(torch.tensor(i * 1.).clone().requires_grad_(True))
    print(x)

f()

Expected results:

[tensor(0., requires_grad=True), tensor(1., requires_grad=True), tensor(2., requires_grad=True)]

This function works fine when I run it in a terminal. However, in my project code (in the same environment), x[0].requires_grad is always False (even if I try to set it back to True afterward), while the other elements are fine. It is quite mysterious. Does anyone know the potential causes? Thanks.
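One potential cause, offered purely as a guess since the project code isn't shown: some other code in the same scope also uses the name x and replaces or detaches the first element. A minimal sketch of how that would reproduce the symptom (the colliding assignment at the end is hypothetical, not taken from the original project):

import torch

x = []
for i in range(3):
    x.append(torch.tensor(i * 1.).requires_grad_(True))

print(x[0].requires_grad)  # True

# Hypothetical colliding line elsewhere in the project that reuses the name x:
# the element is silently swapped for a detached copy.
x[0] = x[0].detach()

print(x[0].requires_grad)  # False

An in-place x[0].detach_() somewhere would behave the same way, and if that code runs again after you reset the flag, requires_grad would keep reading False.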

@ptrblck @albanD et al.


Even more mysteriously, when I change the name from x to abc, the problem is solved…
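That renaming fixes it strongly suggests a name collision rather than an autograd issue. One way to narrow it down, as a sketch (where exactly to place the checks depends on your project layout):

# Drop checks like this right after building the list, then again after each
# later import/section, and bisect until you find where the flag flips.
assert all(t.requires_grad for t in x), "an element of x lost requires_grad"
print(id(x), [t.requires_grad for t in x])  # id(x) reveals if the list itself was replaced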

I get the expected results running your code. Are you able to reproduce the issue by running it in a new, standalone script file?

I inserted the snippet at the beginning of my project, and the problem only appears when the variable is named x. Sorry, I may not be able to reduce it to a minimal reproducible example. But do you have any insights?
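Without a minimal repro, one blunt way to hunt for the collision is to scan the project for writes to the name x. A rough sketch (the search root and the target name are placeholders, and ast.parse will raise on files that don't parse):

import ast
import pathlib

for path in pathlib.Path(".").rglob("*.py"):
    tree = ast.parse(path.read_text())
    for node in ast.walk(tree):
        if isinstance(node, ast.Assign):
            for target in node.targets:
                # Flags plain `x = ...` rebindings of the name
                if isinstance(target, ast.Name) and target.id == "x":
                    print(f"{path}:{node.lineno}: assignment to x")
                # Flags `x[...] = ...` item writes, which could swap in a detached tensor
                if (isinstance(target, ast.Subscript)
                        and isinstance(target.value, ast.Name)
                        and target.value.id == "x"):
                    print(f"{path}:{node.lineno}: item assignment to x[...]")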