Weird behavior when indexing tensors with tensors

Hi,

I am seeing some weird behavior when using tensors for indexing, so I wanted some insight.
x=torch.tensor([[1,2],[3,4],[5,6]])

Now,

x[torch.tensor([[0],[1]])]
tensor([[[ 1,  2]],
        [[ 3,  4]]])

but what I expect is something like:

x[[[0],[1]]]
tensor([ 2])

How can I get this result using tensor indexing? Thank you.

Here is my thinking:

x[torch.tensor([[0],[1]])] selects rows 0 and 1 of x (essentially x[[0, 1]], but keeping the extra dimension from the nested index shape), which gives
tensor([[[1, 2]],
        [[3, 4]]])

This is because torch treats a single index tensor as indexing only the first dimension of x. In our case x[torch.tensor([[0],[1]])] supplies row indices but no column indices, so it returns the full rows [1, 2] and [3, 4], and the result keeps the shape of the index tensor.
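To make the difference concrete, here is a minimal sketch that just reproduces the outputs shown above, contrasting the two forms:

import torch

x = torch.tensor([[1, 2], [3, 4], [5, 6]])

# A single index tensor only indexes dim 0 (the rows); the index tensor's
# shape (2, 1) is kept, so each selected row is wrapped in an extra dimension.
idx = torch.tensor([[0], [1]])
print(x[idx].shape)   # torch.Size([2, 1, 2])
print(x[idx])         # tensor([[[1, 2]], [[3, 4]]])

# A nested Python list is instead interpreted as one index list per dimension,
# so [[0], [1]] acts like x[[0], [1]], i.e. row 0, column 1 (as in the output above).
print(x[[[0], [1]]])  # tensor([2])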

Thanks @jmandivarapu1.
Can I use tensor indexing to get the same result as indexing with a list, i.e. like x[[[0],[1]]]?
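For example, would something like the following be the recommended way? This is just my guess: passing one index tensor per dimension, torch.tensor([0]) for the rows and torch.tensor([1]) for the columns.

import torch

x = torch.tensor([[1, 2], [3, 4], [5, 6]])

# One index tensor per dimension: rows [0], columns [1].
print(x[torch.tensor([0]), torch.tensor([1])])  # tensor([2])

# Or unpack the original (2, 1) index tensor into a tuple of per-dimension indices.
idx = torch.tensor([[0], [1]])
print(x[tuple(idx)])                            # tensor([2])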