I have a tensor X and defined Y = X.detach() (or Y = X.data). According to the documentation, Y is a new tensor, but when I change the values of Y, X changes too. This is strange. The same thing happens even if I define Y = X.detach().numpy() or Y = X.data.numpy(). Is this a bug?

I had the same issue. As far as I can tell, detach() creates a new tensor object Y with `requires_grad=False`, but Y still shares the same underlying storage as X, so in-place changes to one are visible in the other. `X.data` and `.numpy()` behave the same way: no data is copied.
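If it helps, here is a minimal sketch of that behavior (names are just for illustration):

```python
import torch

x = torch.ones(3, requires_grad=True)
y = x.detach()          # new tensor object, but same underlying storage
print(y.requires_grad)  # False

y[0] = 42               # in-place write through y...
print(x)                # ...is visible in x: tensor([42., 1., 1.], requires_grad=True)

# The two tensors really do share memory:
print(x.data_ptr() == y.data_ptr())  # True
```

So Y is "new" only in the sense of being a new tensor object detached from the autograd graph, not a new copy of the data.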

I solved this with `Z = Y.clone()`. Now changing Z's values won't change X's values.
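For reference, a short sketch of the clone() fix (variable names are illustrative):

```python
import torch

x = torch.ones(3, requires_grad=True)
z = x.detach().clone()  # detach from the graph, then copy the storage

z[0] = 42               # writes to the copy only
print(x)                # x is unchanged: tensor([1., 1., 1.], requires_grad=True)
print(x.data_ptr() == z.data_ptr())  # False: separate memory
```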

Thanks for confirming this. I also used clone() to solve it, but it wasted a lot of my time before I found it… I was really misled by the documentation.