Efficient matrix multiplication

I have two tensors: A with shape [D, N, M] and B with shape [D, M, S]. I want to multiply them over the last two dimensions for each index along the first dimension, i.e. compute A[d,:,:] @ B[d,:,:] for each d from 1 to D. Currently I'm doing this with a for loop. Is there another method I can use to make this operation more efficient? For reference, here is a minimal sketch of what I'm doing now (the sizes are placeholders, not my real shapes):
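import torch

# Placeholder sizes for illustration only
D, N, M, S = 4, 3, 5, 2
A = torch.randn(D, N, M)
B = torch.randn(D, M, S)

# Current approach: one matrix multiply per slice along the first dimension
out = torch.stack([A[d] @ B[d] for d in range(D)])   # shape [D, N, S]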

Kevin

You might want to take a look at the docs for matmul: torch.matmul — PyTorch 1.10 documentation (https://pytorch.org/docs/stable/generated/torch.matmul.html). When both inputs are 3-D, it performs a batched matrix multiply over the first dimension, which is exactly what you describe.

>>> import torch
>>> a = torch.tensor([[[2, 2], [2, 2]], [[3, 3], [3, 3]]])
>>> b = torch.tensor([[[1, 0], [0, 1]], [[1, 0], [0, 1]]])
>>> torch.matmul(a, b)
tensor([[[2, 2],
         [2, 2]],

        [[3, 3],
         [3, 3]]])
>>>
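Applied to your shapes, a quick sketch might look like this (sizes are placeholders, not taken from your post):

import torch

D, N, M, S = 4, 3, 5, 2             # placeholder sizes
A = torch.randn(D, N, M)
B = torch.randn(D, M, S)

batched = torch.matmul(A, B)        # one call, shape [D, N, S]
# torch.bmm(A, B) is equivalent here, since both inputs are 3-D

# Sanity check against the loop-based version
looped = torch.stack([A[d] @ B[d] for d in range(D)])
print(torch.allclose(batched, looped))   # True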

Even more general matmul-style ops can be expressed with einsum:
https://pytorch.org/docs/stable/generated/torch.einsum.html
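For example, assuming the contraction you want is over the shared M dimension, the same batched product could be written as:

import torch

D, N, M, S = 4, 3, 5, 2             # placeholder sizes
A = torch.randn(D, N, M)
B = torch.randn(D, M, S)

# 'dnm,dms->dns' sums over m and keeps d as a batch dimension
out = torch.einsum('dnm,dms->dns', A, B)
print(out.shape)                    # torch.Size([4, 3, 2])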

Thanks. Actually, I had just found that documentation myself, and it's super useful.