Assigning a non-contiguous slice of a tensor to another tensor

I have a two-dimensional tensor a and a list of indices b. I want to assign the sub-matrix defined by the indices in b to some other values, contained in a tensor c.

If the indices in b are contiguous, this is easy. For example, suppose b contains the indices 0, 1, 2; then I can do:

>>> a = torch.zeros(5, 5)
>>> c = torch.ones(3, 3)
>>> a[0:3, 0:3] = c
>>> a

 1  1  1  0  0
 1  1  1  0  0
 1  1  1  0  0
 0  0  0  0  0
 0  0  0  0  0
[torch.FloatTensor of size 5x5]

which is what I want. But what if the indices in b are not contiguous? That is, what if b = [0, 2, 4] or b = [0, 2, 3]?
Doing a[b, b] returns something that is not the sub-matrix I want: it selects only the paired elements a[0, 0], a[2, 2], and a[4, 4], not the full 3x3 grid. Right now I am doing

>>> b = [0, 2, 4]
>>> for i in range(len(b)):
...  for j in range(len(b)):
...   a[b[i], b[j]] = c[i, j]
... 
>>> a

 1  0  1  0  1
 0  0  0  0  0
 1  0  1  0  1
 0  0  0  0  0
 1  0  1  0  1
[torch.FloatTensor of size 5x5]

But it is way too slow. Is there a better way?


Are the indices evenly spaced?
If so, you could try:

a = torch.zeros(5, 5)
a[::2, ::2] = 1
print(a)
> tensor([[ 1.,  0.,  1.,  0.,  1.],
          [ 0.,  0.,  0.,  0.,  0.],
          [ 1.,  0.,  1.,  0.,  1.],
          [ 0.,  0.,  0.,  0.,  0.],
          [ 1.,  0.,  1.,  0.,  1.]])
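
A strided slice also accepts a tensor on the right-hand side, so (reusing the a and c from the question) the whole block can be written in one assignment:

import torch

a = torch.zeros(5, 5)
c = torch.ones(3, 3)
a[::2, ::2] = c  # the slice is a 3x3 view of a, so the shapes match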

Hi,

no, they are not; they are random indices

How about index_copy_, index_put_ or scatter? They all support copying data by index.
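
For example, index_put_ handles the two-dimensional case in one call if the row and column indices are shaped to broadcast against each other; a minimal sketch reusing the a, b, and c from the question:

import torch

a = torch.zeros(5, 5)
c = torch.ones(3, 3)
b = torch.tensor([0, 2, 4])

# b[:, None] has shape (3, 1) and b[None, :] has shape (1, 3); together they
# broadcast to the full 3x3 grid of (row, col) pairs, so every element of c
# lands in the selected sub-matrix.
a.index_put_((b[:, None], b[None, :]), c)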

If I am not mistaken, index_copy only works along one dimension. Here I want to slice across two dimensions 🙂

You could flatten the tensor, recalculate the indices, and reshape it again afterwards.
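
A minimal sketch of that flatten-and-index approach, assuming the row-major layout PyTorch uses (the linear index of element (i, j) in a tensor with n_cols columns is i * n_cols + j):

import torch

a = torch.zeros(5, 5)
c = torch.ones(3, 3)
b = torch.tensor([0, 2, 4])

n_cols = a.size(1)
# Broadcasting b against itself yields all 9 (row, col) pairs as linear indices.
flat_idx = (b[:, None] * n_cols + b[None, :]).reshape(-1)
a.view(-1)[flat_idx] = c.reshape(-1)  # the flat view shares storage with a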

So if you want to select the sub-matrix given rows and columns, with possibly non-contiguous and unevenly spaced indices, the clearest way I found is generating the indices using np.ix_.

sub_matrix = matrix[np.ix_(wanted_rows, wanted_columns)]
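
The same trick should also work on the left-hand side for the assignment asked about above, since PyTorch accepts the broadcastable NumPy index arrays that np.ix_ produces; a pure-PyTorch equivalent broadcasts the index tensor by hand:

import numpy as np
import torch

a = torch.zeros(5, 5)
c = torch.ones(3, 3)
b = [0, 2, 4]

# np.ix_ reshapes the index lists to (3, 1) and (1, 3) so they broadcast to
# the full 3x3 grid instead of the three diagonal pairs a[b, b] would give.
a[np.ix_(b, b)] = c

# Pure-PyTorch equivalent:
idx = torch.tensor(b)
a[idx[:, None], idx[None, :]] = c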

Sorry for the bump; I think I found a good solution and this thread popped up as the first Google result.
