Broadcasting element-wise multiplication in PyTorch

I have a tensor in PyTorch with size torch.Size([1443747, 128]). Let's name it tensor A. In this tensor, 128 represents a batch size. I have another 1D tensor with size torch.Size([1443747]). Let's call it B. I want to do element-wise multiplication of B with A, such that B is multiplied with all 128 columns of tensor A (element-wise, of course). In other words, I want to broadcast the element-wise multiplication along dimension=1.
How can I achieve this in PyTorch?

If I didn't have a batch size involved in tensor A (batch size = 1), then the normal * operator would do the multiplication easily, and A*B would have produced a resultant tensor of size torch.Size([1443747]). However, I don't understand why PyTorch is not broadcasting the tensor multiplication along dimension 1. Is there any way to do this?

What I want is: B should be multiplied with all 128 columns of A in an element-wise manner. So the resulting tensor's size would be torch.Size([1443747, 128]).
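For reference, here is a minimal sketch of the failing attempt, using smaller stand-in shapes (assumed for illustration) in place of the original sizes:

```python
import torch

A = torch.randn(4, 128)   # stands in for torch.Size([1443747, 128])
B = torch.randn(4)        # stands in for torch.Size([1443747])

# Broadcasting aligns trailing dimensions, so PyTorch tries to match
# A's last dimension (128) against B's only dimension (4) and fails:
try:
    C = A * B
except RuntimeError as e:
    print(e)  # size-mismatch error at non-singleton dimension 1
```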

Hello Mr. Knight!

Give B a dimension of size 1 using unsqueeze() so that it has a
dimension from which to broadcast:

B.unsqueeze(1) * A
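A minimal sketch showing the shapes involved, again with smaller stand-in sizes (assumed for illustration):

```python
import torch

N = 5
A = torch.randn(N, 128)    # shape [N, 128]
B = torch.randn(N)         # shape [N]

# B.unsqueeze(1) has shape [N, 1], which broadcasts across the 128 columns.
C = B.unsqueeze(1) * A
print(C.shape)             # torch.Size([5, 128])

# Equivalent alternatives:
# C = B[:, None] * A
# C = A * B.view(-1, 1)
```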

Best.

K. Frank