Tensor division in batches


I am trying to calculate the mean of a group of tensors all at once. Does anyone know how to do this without using a for loop? My shapes look like this:

main_tensor.shape   = torch.Size([20, 100])
length_tensor.shape = torch.Size([20])

so for each of my 20 samples, I am trying to divide each of its 100 elements by that sample's length, which is stored in length_tensor.



main_tensor / length_tensor.view(20, 1) should do the trick.

You’d like to divide each element of main_tensor[i] by length_tensor[i]. The key is broadcasting length_tensor to match the size of main_tensor so that we can do element-wise division.
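A minimal sketch of this, using made-up tensor values with the shapes from the question. Reshaping `length_tensor` to `(20, 1)` lets broadcasting stretch it across the 100 columns during the division; `unsqueeze(1)` or indexing with `None` achieve the same thing.

```python
import torch

# Hypothetical tensors matching the shapes in the question.
main_tensor = torch.randn(20, 100)                    # 20 samples, 100 elements each
length_tensor = torch.randint(1, 10, (20,)).float()   # one length per sample

# Reshape lengths to (20, 1); broadcasting expands them across the columns.
result = main_tensor / length_tensor.view(20, 1)

# Equivalent ways to add the trailing dimension:
result_b = main_tensor / length_tensor.unsqueeze(1)
result_c = main_tensor / length_tensor[:, None]

# Check against the explicit for-loop version.
loop_result = torch.stack(
    [main_tensor[i] / length_tensor[i] for i in range(20)]
)
print(torch.allclose(result, loop_result))  # True
```

Using `view(-1, 1)` instead of `view(20, 1)` avoids hard-coding the batch size.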