Batched index assign

I have two tensors

a = torch.tensor([[1, 1, 1, 1], [1, 2, 3, 4]])
b = torch.tensor([[0.1, 0.2, 0.3, 0.4], [0.1, 0.2, 0.3, 0.4]])

I want to fill these values into a tensor of dim 2 x 10 (vocab size), like

tensor([[0,0.1+0.2+0.3+0.4,0,0,…],[0,0.1,0.2,0.3,0.4,0,0,…]])
Basically, fill the corresponding values of tensor b into a new tensor at the indices given by a.

I cannot even do
for i, (a_, b_) in enumerate(zip(a, b)):
    c[i][a_] += b_

because there are repeated indices, as shown in the example: 1 occurs 4 times here, so this method just puts the last value 0.4 at index 1, but I want 0.1+0.2+0.3+0.4 at that index.
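Here is a minimal repro of that behaviour for a single row (same toy values as above, vocab size 10), showing that fancy indexing with += does not accumulate over duplicate indices:

import torch

a_ = torch.tensor([1, 1, 1, 1])
b_ = torch.tensor([0.1, 0.2, 0.3, 0.4])
c = torch.zeros(10)   # one row of the 2 x 10 target
c[a_] += b_           # advanced indexing: duplicate indices overwrite each other
print(c[1])           # tensor(0.4000), not 1.0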

Is there any other method than having another for loop over the tensor? Also, if I want access to the tensor for backprop, how do I select these items: torch.index_select(tensor)? Will it keep them in the computational graph? For example, if I am interested in those particular b values (which are tensors), can I include them in the loss and call backward(), and will it still backprop through the b_'s?
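For the autograd part, as far as I understand, selection ops like torch.index_select (and plain indexing or gather) are differentiable, so the selected values stay in the graph; a small sketch of what I mean:

import torch

b = torch.tensor([0.1, 0.2, 0.3, 0.4], requires_grad=True)
picked = torch.index_select(b, 0, torch.tensor([1, 3]))  # stays in the graph
loss = picked.sum()
loss.backward()
print(b.grad)  # tensor([0., 1., 0., 1.]) -> gradient reaches the selected b_'s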

[SOLVED using scatter_add_]
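For reference, a minimal sketch of the scatter_add route with the tensors from the example (using the out-of-place scatter_add instead of scatter_add_ so the result stays differentiable w.r.t. b):

import torch

a = torch.tensor([[1, 1, 1, 1], [1, 2, 3, 4]])
b = torch.tensor([[0.1, 0.2, 0.3, 0.4], [0.1, 0.2, 0.3, 0.4]], requires_grad=True)

c = torch.zeros(2, 10).scatter_add(1, a, b)  # duplicate indices in a are summed
# c[0, 1] == 0.1 + 0.2 + 0.3 + 0.4, c[1, 1:5] == [0.1, 0.2, 0.3, 0.4]

c.sum().backward()
print(b.grad)  # all ones: backprop flows through to b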