Just wondering whether this difference between indexing with tensors and indexing with lists is intended.

Lists

```python
a = [1, 1, 1]
b = a[1]
b -= 1
```

`a` will still be `[1, 1, 1]` and `b` will be `0`.
On the other hand, for tensors,

```python
a = torch.ones(3)
b = a[1]
b -= 1
```

`a` will be `tensor([1., 0., 1.])` and `b` will be `tensor(0.)`.

Is this the intended behavior? If not, how can I modify it so that indexing copies the tensor into a brand-new variable? For a Python list, indexing an element is enough to get an independent value.
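From experimenting, it looks like indexing a tensor returns a view that shares storage with the original, and calling `.clone()` on the result gives an independent copy. A minimal sketch of what I mean (assuming `Tensor.clone()` is the right tool here):

```python
import torch

# Plain indexing returns a 0-dim tensor that shares storage with `a`,
# so an in-place op on it also modifies `a`.
a = torch.ones(3)
view = a[1]
view -= 1
print(a)  # tensor([1., 0., 1.])

# .clone() appears to detach the value into its own storage,
# so modifying the copy leaves `a` untouched.
a = torch.ones(3)
copy = a[1].clone()
copy -= 1
print(a)  # tensor([1., 1., 1.])
```

Is `.clone()` the recommended way to get list-like copy semantics here, or is there a better idiom?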