How to implement 4D tensor multiplication?

Given A: [B, N, K, K], B: [B, S, K, K].
The expected shape of the matrix multiplication result is [B, N, S, K, K].
How to use torch.matmul to get this result?

Hi,

Assuming that you want to reduce dimension -1 of A and dimension -2 of B,
you can do the following so that the batching works fine:

torch.matmul(A.unsqueeze(3), B.unsqueeze(2))

Hi,

I tried your solution, but I ran into an error. I used the code below.

a = torch.rand(2, 8, 3, 3)
b = torch.rand(2, 4, 3, 3)
ans = torch.matmul(a.unsqueeze(3), b.unsqueeze(2))

And I got the error:

ans = torch.matmul(a.unsqueeze(3), b.unsqueeze(2))
RuntimeError: The size of tensor a (8) must match the size of tensor b (4) at non-singleton dimension 1

I figured out the problem. The following code works:

torch.matmul(A.unsqueeze(2), B.unsqueeze(1))
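
A quick sanity check (a minimal sketch using the same toy tensors as in my earlier post, shapes [2, 8, 3, 3] and [2, 4, 3, 3]) gives the expected output shape:

import torch

a = torch.rand(2, 8, 3, 3)  # [B, N, K, K]
b = torch.rand(2, 4, 3, 3)  # [B, S, K, K]

ans = torch.matmul(a.unsqueeze(2), b.unsqueeze(1))
print(ans.shape)  # torch.Size([2, 8, 4, 3, 3]), i.e. [B, N, S, K, K]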

I’m curious about how it works. Could you please explain the reason?

Thanks.

This function may help you:
torch.einsum
It multiplies two matrices of any dimension.

Yeah, I tried torch.einsum and got the same result.

torch.einsum('abik, ackj -> abcij', A, B)
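
For completeness, here is a minimal sketch (on the same toy tensors as above) confirming that the einsum and matmul versions agree:

import torch

a = torch.rand(2, 8, 3, 3)  # [B, N, K, K]
b = torch.rand(2, 4, 3, 3)  # [B, S, K, K]

via_matmul = torch.matmul(a.unsqueeze(2), b.unsqueeze(1))
via_einsum = torch.einsum('abik, ackj -> abcij', a, b)
print(torch.allclose(via_matmul, via_einsum))  # True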

Thanks.

Oh, my bad, I miscounted the dimensions.

What the unsqueeze does is make the sizes (2, 8, 1, 3, 3) and (2, 1, 4, 3, 3), so that matmul can broadcast over the two dimensions of size 1 and do the matrix product you want on the last two dimensions.
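
To make the broadcasting explicit, here is a small sketch printing the intermediate shapes (using the same toy sizes as earlier in the thread):

import torch

a = torch.rand(2, 8, 3, 3)  # [B, N, K, K]
b = torch.rand(2, 4, 3, 3)  # [B, S, K, K]

print(a.unsqueeze(2).shape)  # torch.Size([2, 8, 1, 3, 3])
print(b.unsqueeze(1).shape)  # torch.Size([2, 1, 4, 3, 3])
# matmul broadcasts 8 against 1 and 1 against 4, then multiplies the
# trailing [3, 3] matrices, giving the [2, 8, 4, 3, 3] result shown above.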

Got it. Thanks for your explanation.