# Replace tensor values at indexes

Hi folks!

I have been going through Pytorch documentation in search for a way to do an efficient per-index replacement of values inside a tensor. The problem:

- `dest` - 4D tensor (N, H, W, C1) that I want to update
- `idxs` - 4D tensor of indexes I want to access (N, H, W, C2)
- `source` - 4D tensor of values I want to put into `dest` at `idxs` (N, H, W, C2)

In practice `dest` is a minibatch of per-pixel probabilities, where C1 is the number of distinct categories and C2 is the number of top-5 updates taken from `source`. Note that C1 != C2.

I have been looking at `torch.Tensor.index_copy_` but I can’t seem to find a way to broadcast it along the N, H, W dimensions.

My (working) solution with a for loop:

```python
def col_wise_replace(dest, idxs, source):
    ''' replace probability stacks '''
    n, h, w, _ = dest.shape
    for k in range(n):
        for i in range(h):
            for j in range(w):
                dest[k, i, j].index_copy_(0, idxs[k, i, j], source[k, i, j])
```

This solution is slow. Is there any numpy / torch-like way I can achieve the same?

Hi,

`torch.Tensor.scatter_()` is doing exactly that: `dest.scatter_(-1, idxs, source)`.
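To illustrate, here is a small self-check on made-up shapes (N, H, W, C1, C2 are arbitrary here): the loop from the question and a single `scatter_` call along the last dimension produce the same tensor, since `scatter_` broadcasts over the leading N, H, W dimensions.

```python
import torch

# Hypothetical shapes: N=2, H=3, W=3, C1=7 categories, C2=5 updated entries.
N, H, W, C1, C2 = 2, 3, 3, 7, 5
dest = torch.rand(N, H, W, C1)
# Unique indices per position (duplicate indices make scatter_ nondeterministic).
idxs = torch.stack([torch.randperm(C1)[:C2] for _ in range(N * H * W)]).view(N, H, W, C2)
source = torch.rand(N, H, W, C2)

# Loop version from the question, for comparison.
expected = dest.clone()
for k in range(N):
    for i in range(H):
        for j in range(W):
            expected[k, i, j].index_copy_(0, idxs[k, i, j], source[k, i, j])

# Vectorized: one scatter_ along the last dim replaces the triple loop.
result = dest.clone()
result.scatter_(-1, idxs, source)

assert torch.equal(result, expected)
```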

@albanD

I am wondering how I can make use of `torch.scatter_()` or any other built-in masking function for this one-hot masking task.

I have two tensors:

`X` of shape `[batch, 100]` and `label` of shape `[batch]`

num_classes = 10

so each label corresponds to 10 of those 100 values in `X`.
For instance, for `X` of shape `[1, 100]`:

```
X = ([ 0.0468, -1.7434, -1.0217, -0.0724, -0.5169, -1.7318, -0.1207, -0.8377,
-0.8055,  0.7438,  0.1139,  1.2162, -1.7950,  1.7416, -1.2031, -1.4833,
-0.5454,  0.2466, -1.2303, -0.4257,  0.9873, -1.5905, -1.3950,  0.4013,
-1.0523,  1.4450,  0.6574,  1.5239, -0.3503, -0.1114,  1.8192, -1.7425,
0.4678,  0.4074,  1.7606, -1.0502,  0.0724,  0.1721,  0.1108,  0.4453,
0.2278, -1.5352, -0.1232,  1.1052,  0.2496,  1.2898, -0.4167, -0.8211,
0.2340, -0.3829, -0.1328,  0.1033,  2.8693, -0.8802, -0.0433,  0.5335,
0.0662,  0.4250,  0.2353, -0.1590,  0.0865,  0.6519, -0.2242,  1.5300,
1.7021, -0.9451,  0.5845, -0.7309,  0.7124,  0.6544, -1.4426, -0.1859,
-1.5313, -1.5391, -0.2138, -1.0203,  0.6678,  1.3445, -1.3453,  0.5222,
0.9510,  0.0969, -0.5437, -0.2727, -0.6090, -2.9624,  0.4578,  0.5257,
-0.2866,  0.0818, -1.2454,  1.6511,  0.1634,  1.3720, -0.4222,  0.5347,
0.3586, -0.3506,  2.6866,  0.5084])

label = [3]
```

I would like to do a one-hot masking: set positions 30-40 of the tensor `X` to “1” and all the remaining positions to “0”.

so for,
label = 0 -> mask (0 to 10) as ‘1’ and rest as ‘0’
label = 1 -> mask (10 to 20) as ‘1’ and rest as ‘0’
… and so on.
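One vectorized way to build such a mask (a sketch, assuming zero-indexed labels and the block layout described above) is to compare each position’s block id against each sample’s label via broadcasting:

```python
import torch

num_classes, block = 10, 10
labels = torch.tensor([0, 7])  # one label per sample, example values
batch = labels.shape[0]

# Block id of each of the 100 positions: 0,0,...,0, 1,1,...,1, ..., 9.
block_id = torch.arange(num_classes * block) // block  # shape [100]

# A position is 1 iff its block id matches that sample's label.
mask = (block_id.unsqueeze(0) == labels.unsqueeze(1)).float()  # [batch, 100]
```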

Hi,

You most likely want to view `X` as a Tensor of size `[batch, 10, 10]` before doing the masking, to make things clearer.
Then you can fill in the values with 0s and 1s as needed.

But I want to retain the shape of `X`. Will it still be possible?

You can reshape it back when you’re done.
Or even better, don’t reshape it in place and just do something like:

```
x = XXXX

x_view = x.view(batch, 10, 10)
# fill the values corresponding to label idx with 1s
x_view.select(1, label_idx) = 1

# Now you can use x that was changed by the inplace op.
```

@albanD
The second line gives an error, btw I added an example at the end of my post because I am not able to relate how this will give me the masking I am trying to get.

```
>>> w_view.select(1, label) = 1
  File "<stdin>", line 1
SyntaxError: can't assign to function call
```

Oh right, I forgot this was not valid, my bad. You can do `w_view.select(1, label).fill_(1)`.
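In isolation, that works as follows (a sketch with made-up values: a batch of 2 and label 3). Since the view shares storage with the flat tensor, filling the view also updates positions 30-39 of the original:

```python
import torch

batch, label = 2, 3  # hypothetical example values
w = torch.zeros(batch, 100)
w_view = w.view(batch, 10, 10)

# Select the slice at index `label` along dim 1 and fill it in place.
w_view.select(1, label).fill_(1)

# The flat tensor w now has 1s at positions 30..39 in every row.
```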

But for your case, I guess what you want is:

```
labels = # Tensor or list of batch integers

res = torch.zeros(batch, 10, 10)

for label in labels:
    res.select(1, label).fill_(1)

res = res.view(batch, 100)
```

@albanD

Thank you for the solution. However, it still gives me the masking of all the labels in all the vectors. Instead, each label should only be reflected in its own vector!

```
label = ([0, 7])

tensor([[1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 1., 1.,
1., 1., 1., 1., 1., 1., 1., 1., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
[1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 1., 1.,
1., 1., 1., 1., 1., 1., 1., 1., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]])

```
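For what it’s worth, one way to get per-sample masking (a sketch, not from the thread) is advanced indexing with `arange`, which pairs each batch row with its own label instead of looping over all labels:

```python
import torch

labels = torch.tensor([0, 7])  # the example labels from the post above
batch = labels.shape[0]

res = torch.zeros(batch, 10, 10)
# res[i, labels[i]] = 1 for each sample i: each row gets only its own block.
res[torch.arange(batch), labels] = 1
res = res.view(batch, 100)
```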

I am not sure what you mean.
Can you share an example with input tensors and what you expect as output?

@albanD I found another similar way to do it:
```
    for i in range(batch_size):