Hello @vvanirudh,
Scatter allows you to index the target of the assignment. You need an index tensor of the same shape as the source tensor. Along the chosen dimension, the index tensor supplies the target index; the other dimensions are taken from the element's own position in the source.
So if you take the example from the docs
# some data
orig = torch.rand(2, 5)
# where we want to put the data
# notice: idx.size() equals orig.size()
# idx gives the dimension-zero index into the target
idx = torch.LongTensor([[0, 1, 2, 0, 0], [2, 0, 0, 1, 2]])
# notice: t1.size(1) equals orig.size(1)
t1 = torch.zeros(3, 5).scatter_(0, idx, orig)
print(t1)
the scatter assignment producing t1 gives you the same result as the following explicit loop:
# zeros of the right size
t2 = torch.zeros(3, 5)
# loop over indices of orig (and idx)
for x in range(orig.size(0)):
    for y in range(orig.size(1)):
        # assign to t2 at the place given by idx in the first dimension, and y in the second
        t2[idx[x, y], y] = orig[x, y]
print(t2)
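To convince yourself that the two really coincide, you can run both and compare (this check is my addition, not part of the original example):

```python
import torch

# same setup as above
orig = torch.rand(2, 5)
idx = torch.LongTensor([[0, 1, 2, 0, 0], [2, 0, 0, 1, 2]])

# scatter version
t1 = torch.zeros(3, 5).scatter_(0, idx, orig)

# explicit-loop version
t2 = torch.zeros(3, 5)
for x in range(orig.size(0)):
    for y in range(orig.size(1)):
        t2[idx[x, y], y] = orig[x, y]

print(torch.equal(t1, t2))  # True
```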
gather is the counterpart, i.e. with
copy = t2.gather(0, idx)
print(copy)
copy will be the same as orig. You could thus spell out gather as
copy2 = torch.zeros(*idx.size())
for x in range(idx.size(0)):
    for y in range(idx.size(1)):
        copy2[x, y] = t2[idx[x, y], y]
print(copy2)
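Again, you can verify the round trip directly: because each source element was scattered to a distinct target position here, gathering with the same idx recovers orig exactly (my own sanity check, not from the original example):

```python
import torch

orig = torch.rand(2, 5)
idx = torch.LongTensor([[0, 1, 2, 0, 0], [2, 0, 0, 1, 2]])
t2 = torch.zeros(3, 5).scatter_(0, idx, orig)

# gather version
copy = t2.gather(0, idx)

# explicit-loop version
copy2 = torch.zeros(*idx.size())
for x in range(idx.size(0)):
    for y in range(idx.size(1)):
        copy2[x, y] = t2[idx[x, y], y]

print(torch.equal(copy, orig))   # True
print(torch.equal(copy2, orig))  # True
```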
index_select, on the other hand, does not select single elements, but entire rows, columns, or slices along whatever dimension you pass (similar to passing a list of indices in numpy):
row_idx = torch.LongTensor([0,2])
zeroandthirdrow = t2.index_select(0, row_idx)
print(zeroandthirdrow)
this could be spelled out as
zeroandthirdrow2 = torch.zeros(row_idx.size(0), t2.size(1))
for x in range(row_idx.size(0)):
    for y in range(t2.size(1)):
        zeroandthirdrow2[x, y] = t2[row_idx[x], y]
print(zeroandthirdrow2)
or, with intermediate compactness, using row assignments instead of the inner y loop:
zeroandthirdrow3 = torch.zeros(row_idx.size(0), t2.size(1))
for x in range(row_idx.size(0)):
    zeroandthirdrow3[x, :] = t2[row_idx[x], :]
print(zeroandthirdrow3)
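The same works along dimension 1 to pick columns; here is a small variant of my own (the variable names are mine), which also shows that index_select matches numpy-style fancy indexing on that dimension:

```python
import torch

t2 = torch.rand(3, 5)
col_idx = torch.LongTensor([0, 3])

# select columns 0 and 3 along dimension 1
cols = t2.index_select(1, col_idx)
print(cols.size())                        # torch.Size([3, 2])
print(torch.equal(cols, t2[:, col_idx]))  # True
```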
I must admit that I usually reason with shapes and by spelling out the indices (on paper rather than in for loops), while others seem to prefer fancy pictures.
Best regards
Thomas