Why is Tensor.clone() called clone and not copy?

Hi,

I think the doc here is misleading :confused: The master doc has been updated and is clearer: https://pytorch.org/docs/master/generated/torch.clone.html?highlight=clone#torch.clone

The difference is that if you use copy_, the original value is overwritten and won't get gradients. With clone there is no original value being overwritten, so this issue doesn't arise.

x = torch.rand(10)
y = torch.rand(10, requires_grad=True)

res = y.clone().copy_(x)  # copy_ overwrites the cloned values with x
res.sum().backward()
assert (y.grad == 0).all()  # gradients flow to x, not to y's original values
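For contrast, here is a minimal sketch of the plain clone case: without the in-place copy_, the clone stays connected to the autograd graph and y does receive gradients.

```python
import torch

y = torch.rand(10, requires_grad=True)

res = y.clone()       # differentiable copy, still attached to y's graph
res.sum().backward()  # gradient flows back through the clone

assert (y.grad == 1).all()  # d(sum)/dy = 1 for every element
```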