Tensor indexing

Just wondering if this behavior of indexing with tensors, as opposed to with lists, is intended.

Lists

a = [1,1,1]
b = a[1]
b -= 1

a will still be [1,1,1] and b will be 0
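
A quick id() check illustrates what happens in the list case: indexing returns the element itself, and since ints are immutable, b -= 1 rebinds b to a new object instead of changing the list.

a = [1, 1, 1]
b = a[1]
print(id(b) == id(a[1]))  # True: b is the very same int object as the element
b -= 1                    # ints are immutable, so this rebinds b to a new int
print(id(b) == id(a[1]))  # False: a[1] was never touched
print(a, b)               # [1, 1, 1] 0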
On the other hand, for tensors,

import torch

a = torch.ones(3)
b = a[1]
b -= 1

a will be tensor([ 1., 0., 1.]) and b will be tensor(0.).

Is this the intended behavior? If not, how can I modify it so that the tensor is copied as a brand-new variable? For a Python list, indexing is enough to copy it as a new variable.

It’s intended behavior.
If you want to get a copy, you could use .clone():

a = torch.ones(3)
b = a[1].clone()
b -= 1
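
Some background on why this happens: indexing a tensor returns a view that shares the original tensor's storage, so an in-place operation like -= writes through to a, while an out-of-place operation creates a new tensor and leaves a alone. A small example contrasting the two:

import torch

a = torch.ones(3)
b = a[1]      # b is a view sharing a's storage, not a copy
b = b - 1     # out-of-place: creates a new tensor and rebinds b
print(a)      # tensor([1., 1., 1.]) -- unchanged

a = torch.ones(3)
b = a[1]
b -= 1        # in-place: writes through the view into a's storage
print(a)      # tensor([1., 0., 1.])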

Just curious: in the resnet implementation, they did not use clone().

Wouldn't the subsequent operations modify residual after passing x through the layers as well? Shouldn't residual be the original input x?

residual and x won’t be modified by the operations, as new tensors are being created.
Have a look at this small example:

import torch
import torch.nn as nn

lin = nn.Linear(10, 2)
x = torch.ones(1, 10)
residual = x          # just another reference to the same tensor
output = lin(x)       # lin(x) returns a new tensor; x is not modified

Both residual and x still have the same shape and values.
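
Applied to the resnet question: operations like out = layer(x) allocate new tensors, so even the in-place out += residual at the end of the block only modifies the newly created out, never the stored input. A simplified sketch of that pattern (using a Linear layer as a stand-in for the actual conv blocks):

import torch
import torch.nn as nn

layer = nn.Linear(10, 10)                  # stand-in for the block's conv/bn stack
x = torch.ones(1, 10)
residual = x                               # no clone needed: just another reference
out = layer(x)                             # allocates a brand-new tensor
out += residual                            # in-place add on the new tensor only
print(residual is x)                       # True: still the original input
print(torch.equal(x, torch.ones(1, 10)))   # True: input values untouched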
