Element-wise sum of batched tensors

I have a list of tensors t_list and I need their element-wise sum. I am concerned that, because the tensors are batched, my method is incorrect.

Code example

t_list = [t1, t2, t3, t4]  # each ti is a 32 x 1 x 128 tensor
t_list = torch.stack(t_list)  # shape: 4 x 32 x 1 x 128
sum_list = sum(t_list)  # iterates over dim 0; result is 32 x 1 x 128

Is this the correct way to sum across the batch, or do I need to specify a dimension for sum?

Your approach works: Python's built-in sum iterates over the first dimension of the stacked tensor and adds the 32 x 1 x 128 slices element-wise. If you want to be explicit about which dimension is reduced, use torch.sum(t_list, dim=0) on the stacked tensor; it gives the same result.
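A quick sketch comparing both options (the tensor contents here are made-up random values, just to match the shapes in your post):

```python
import torch

# Hypothetical tensors matching the shapes in the question
t1, t2, t3, t4 = (torch.randn(32, 1, 128) for _ in range(4))
t_list = [t1, t2, t3, t4]

# Option 1: built-in sum adds the tensors element-wise
s1 = sum(t_list)                     # shape: 32 x 1 x 128

# Option 2: stack into 4 x 32 x 1 x 128, then reduce dim 0 explicitly
s2 = torch.stack(t_list).sum(dim=0)  # shape: 32 x 1 x 128

assert s1.shape == s2.shape == (32, 1, 128)
assert torch.allclose(s1, s2)
```

Note that Option 1 works on the plain Python list directly, so the torch.stack call is only needed if you want the explicit dim argument.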
