How to multiply two tensors along selected dimensions

Dear altruists,

I have two tensors with shapes (2, 4, 4) and (4, 4), respectively.

A = tensor([[[0, 0, 1, 0],
             [0, 0, 1, 0],
             [1, 1, 0, 1],
             [0, 0, 1, 0]],

            [[0, 0, 1, 0],
             [0, 0, 1, 0],
             [1, 1, 0, 1],
             [0, 0, 1, 0]]])

B = tensor([[0, 1, 1, 0],
            [1, 0, 1, 0],
            [1, 1, 0, 1],
            [0, 0, 1, 0]])

I want to multiply the two tensors along the second and third axes. My desired output is a vector of dim 2.

I am trying to take advantage of the speed of CUDA (torch.cuda) for matrix multiplication. Is there an elegant way to solve this problem?

Thanks in advance!

Hi Nil!

It’s not clear to me what you mean by this.

One guess is that you would like to perform a so-called contraction on
the last two dimensions of your two tensors. If so, torch.einsum()
might be the simplest approach:

>>> import torch
>>> torch.__version__
'2.2.0'
>>> A = torch.tensor([[[0, 0, 1, 0],
...                    [0, 0, 1, 0],
...                    [1, 1, 0, 1],
...                    [0, 0, 1, 0]],
...                   [[0, 0, 1, 0],
...                    [0, 0, 1, 0],
...                    [1, 1, 0, 1],
...                    [0, 0, 1, 0]]])
>>> B = torch.tensor([[0, 1, 1, 0],
...                   [1, 0, 1, 0],
...                   [1, 1, 0, 1],
...                   [0, 0, 1, 0]])
>>> torch.einsum('ijk, jk -> i', A, B)
tensor([6, 6])

einsum() is relatively mature and well optimized, so I would expect it to take
nearly full advantage of the performance of a GPU.
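
For example, here is a minimal sketch of running the same contraction on the
GPU (assuming a CUDA device is available):

>>> A_gpu = A.to('cuda')   # move the inputs onto the gpu
>>> B_gpu = B.to('cuda')
>>> torch.einsum('ijk, jk -> i', A_gpu, B_gpu)   # contraction now runs on the gpu
tensor([6, 6], device='cuda:0')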

(If this is not what you are trying to do, please post a script that implements
your desired computation – using for loops, if that’s what works for you – to
make clear specifically what you want.)

Best.

K. Frank

Thanks, K. Frank. I apologize for my late response. I solved the problem.

I added a dimension at the beginning of B to make it (1, 4, 4). Then I multiplied A and B elementwise, which gave an output of shape (2, 4, 4). Summing along the second and third axes produced the final output vector of dim 2.
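
For reference, a minimal sketch of that broadcast-and-sum approach (using the same A and B as above):

>>> C = A * B.unsqueeze(0)   # B becomes (1, 4, 4) and broadcasts against A's (2, 4, 4)
>>> C.sum(dim=(1, 2))        # sum over the second and third axes
tensor([6, 6])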