How to perform a dot product along a dimension of the same batch?

I have a batch of 256 pairs of vectors of dimension 32, so the shape is [256, 2, 32].

What I would like is to perform the dot product between the two vectors in each pair of the batch, so that I end up with a tensor of shape [256, 32].

How should I do it?

Thank you in advance!

I don’t understand how you wish to end up with such dimensions. If one vector is X[:,0,:] and the other is X[:,1,:], and you take their dot product, the result should be either a scalar, a vector of length 256, or a vector of length 32 (if you perform the dot product along one dimension).

Maybe you want an elementwise product?
In that case it can be achieved by
torch.prod(X, dim=1)
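
For reference, a minimal sketch (assuming a random X with the shape from the question) confirming that this keeps the shape [256, 32]:

import torch

X = torch.randn(256, 2, 32)           # hypothetical batch of 256 pairs of 32-dim vectors

elementwise = torch.prod(X, dim=1)    # multiply the two vectors in each pair -> [256, 32]

# Same thing written out explicitly.
assert torch.allclose(elementwise, X[:, 0, :] * X[:, 1, :])
print(elementwise.shape)              # torch.Size([256, 32])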

Hi Paula (and Alex)!

Building on Alex’s suggestion, I believe that

X.prod(dim=1).sum(dim=1)

should do what you want.
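
Here is a minimal sketch (again assuming a random X of shape [256, 2, 32]) showing that this gives one dot product per pair, i.e. a tensor of shape [256]:

import torch

X = torch.randn(256, 2, 32)

dots = X.prod(dim=1).sum(dim=1)       # elementwise product, then sum over the 32 features -> [256]

# Equivalent explicit version of the per-pair dot product.
reference = (X[:, 0, :] * X[:, 1, :]).sum(dim=1)
assert torch.allclose(dots, reference)
print(dots.shape)                     # torch.Size([256])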

Best.

K. Frank

Yes, thank you both! That cleared up my doubts.

Best,

Paula