I want to index a two-dimensional tensor as I would in NumPy.

a = torch.rand(5,5)
b = torch.LongTensor([4,3,2,1,0])

But typing a[b,b] gives an error:

TypeError: indexing a tensor with an object of type LongTensor. The only supported types are integers, slices, numpy scalars and torch.LongTensor or torch.ByteTensor as the only argument.

How do I index a two-dimensional tensor using two other tensors?
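One workaround that sidesteps tuple-of-tensor indexing entirely is index_select, which takes a dimension and a single LongTensor of indices; a minimal sketch using the same a and b as above:

```python
import torch

a = torch.rand(5, 5)
b = torch.LongTensor([4, 3, 2, 1, 0])

# Select rows (dim 0) by index, then columns (dim 1) by index.
# index_select only ever receives a lone LongTensor, so it avoids
# the unsupported tuple-of-tensors indexing form.
rows_then_cols = a.index_select(0, b).index_select(1, b)
```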

Thanks for your reply. But this still gives the same error:

>>> a[b,:][:,b]
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: indexing a tensor with an object of type LongTensor. The only supported types are integers, slices, numpy scalars and torch.LongTensor or torch.ByteTensor as the only argument.

The answer you gave is not correct.

a = torch.rand(5,5)
b = torch.LongTensor([4,3,2,1,0])
a[b, :][:, b]
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: indexing a tensor with an object of type torch.LongTensor. The only supported types are integers, slices, numpy scalars and torch.LongTensor or torch.ByteTensor as the only argument.

_oldgetitem = torch.FloatTensor.__getitem__

def _getitem(self, slice_):
    # If the index is a tuple containing a LongTensor, handle the
    # first LongTensor found; everything else falls through below.
    if type(slice_) is tuple and torch.LongTensor in [type(x) for x in slice_]:
        i = [j for j, ix in enumerate(slice_)
             if type(ix) == torch.LongTensor][0]
        # Move dimension i to the front, index it with the LongTensor
        # (the supported single-argument form), then move it back.
        # Any other entries in the tuple are assumed to be full slices.
        return self.transpose(0, i)[slice_[i]].transpose(i, 0)
    else:
        return _oldgetitem(self, slice_)

torch.FloatTensor.__getitem__ = _getitem

Then x[b, :][:, b] works as expected. Maybe a bit of a hack though.
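The same transpose idea also works without monkey-patching, since indexing with a lone LongTensor is one of the supported forms; a sketch with the tensors from above:

```python
import torch

a = torch.rand(5, 5)
b = torch.LongTensor([4, 3, 2, 1, 0])

# a[b] selects rows with a single LongTensor (supported); t() swaps
# rows and columns, so the second [b] selects the original columns;
# the final t() restores the orientation.
result = a[b].t()[b].t()  # same values as a[b, :][:, b]
```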

Regarding x[b, :][:, b] vs. x[b, b] – this is a distinction that has never really been solved for numpy arrays either.
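Concretely, in NumPy x[b, b] pairs the indices element-wise rather than forming the row/column grid; the usual workaround for the grid selection is np.ix_. A small illustration:

```python
import numpy as np

x = np.arange(25).reshape(5, 5)
b = np.array([4, 3, 2, 1, 0])

# Element-wise pairing: picks x[4,4], x[3,3], x[2,2], x[1,1], x[0,0]
paired = x[b, b]  # -> [24, 18, 12, 6, 0]

# Full cross product of the two index sets: same as x[b, :][:, b]
grid = x[np.ix_(b, b)]
```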