einsum works in NumPy but not in torch

Hi Folks,

A shape (80, 513)
B shape (2, 80, 501)

In NumPy the following einsum works, but the same call through torch.einsum doesn't.

x = np.einsum("fm,...mt->...ft", A, B, optimize=True)
# x.shape is (2, 513, 501)

But in torch I'm getting:
einsum(): operands do not broadcast with remapped shapes [original->remapped]: [80, 513]->[1, 80, 1, 513] [2, 80, 501]->[2, 1, 501, 80]
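For reference, here is a minimal sketch of what I'm calling, with random placeholder arrays standing in for my real data (the values and dtype are just stand-ins, only the shapes match what I described above):

import numpy as np
import torch

# Placeholder inputs with the shapes from above (random data, not my real arrays)
A = np.random.rand(80, 513).astype(np.float32)      # (80, 513)
B = np.random.rand(2, 80, 501).astype(np.float32)   # (2, 80, 501)

# NumPy call (this is the one that succeeds with my real arrays)
try:
    x_np = np.einsum("fm,...mt->...ft", A, B, optimize=True)
    print("numpy:", x_np.shape)
except ValueError as e:
    print("numpy error:", e)

# Same subscripts through torch.einsum (this is the one that raises the broadcast error)
try:
    x_t = torch.einsum("fm,...mt->...ft", torch.from_numpy(A), torch.from_numpy(B))
    print("torch:", x_t.shape)
except RuntimeError as e:
    print("torch error:", e)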

I can't figure out what the difference is. What am I missing?