Given these tensors:
```python
import torch

batch_size = 5
k = 3
F = torch.randn(k + 1, 3, 3)
B = torch.randn(batch_size, k, 1)
```
I’m trying to perform the following operation, but without using the for loop (which is pretty slow when F and B are huge tensors):
```python
S = F[0, :, :] * B[:, 0:1, 0:1]
for i in range(1, F.shape[0] - 1):
    S += F[i, :, :] * B[:, i:(i + 1), 0:1]
```
Is it possible to do this without the for loop? I am aware of the broadcasting capabilities of PyTorch, but I am not sure how to apply them to this example.
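For reference, here is a minimal sketch of one broadcasting-based approach. It assumes the loop bound is meant to be `range(1, F.shape[0] - 1)`, so only `F[0]` through `F[k-1]` are used, paired with the `k` coefficients in `B`; the sum over the shared index `k` can then be written as a single `einsum` (or, equivalently, as an elementwise product followed by a `sum`):

```python
import torch

batch_size = 5
k = 3
F = torch.randn(k + 1, 3, 3)
B = torch.randn(batch_size, k, 1)

# Loop version (for comparison): S[b] = sum_i F[i] * B[b, i]
S_loop = F[0, :, :] * B[:, 0:1, 0:1]
for i in range(1, F.shape[0] - 1):
    S_loop += F[i, :, :] * B[:, i:(i + 1), 0:1]

# Vectorized: F[:k] has shape (k, 3, 3), B[:, :, 0] has shape (batch_size, k).
# einsum contracts the shared k index, producing shape (batch_size, 3, 3).
S_vec = torch.einsum('kij,bk->bij', F[:k], B[:, :, 0])

# Equivalent pure-broadcasting form:
# (1, k, 3, 3) * (batch_size, k, 1, 1) -> sum over dim 1 -> (batch_size, 3, 3)
S_bcast = (F[:k].unsqueeze(0) * B.unsqueeze(-1)).sum(dim=1)

print(torch.allclose(S_loop, S_vec))    # prints True
print(torch.allclose(S_loop, S_bcast))  # prints True
```

Both vectorized forms avoid Python-level iteration entirely, which is where the speedup on large tensors comes from; `einsum` is usually the more readable choice when the contraction is over a single shared index.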