# How to broadcast 2D Tensor over 4D Tensor?

Hi All,

I’m trying to broadcast a 2D Tensor over a 4D Tensor and I’m not 100% sure how to do it. Let’s say I have two tensors, `mat1` of size `[B, D]` and another Tensor `mat2` of size `[B, D, N, N]`. How could I broadcast `mat1` over dims 2 and 3 of `mat2`?

```
mat1 = torch.randn(1, 4)        # [B, D]
mat2 = torch.randn(1, 4, 2, 2)  # B=1, D=4, N=2
mat1 * mat2                     # throws error
```
```
RuntimeError: The size of tensor a (2) must match the size of tensor b (4) at non-singleton dimension 3
```

One thing I have done is use `torch.unsqueeze` to force broadcasting, but then another problem arises during the backward pass, since I’m changing the size of the Tensor!

Any help is appreciated!

Broadcasting aligns shapes starting from the rightmost dimension and succeeds if and only if, at each position, the sizes are equal, one of them is 1, or one of them does not exist. You could either permute the dims of the tensor like this:

```
b = 1
d = 4
n = 2
mat1 = torch.randn((1, 4))
mat2 = torch.randn((1, 4, 2, 2)).permute(2, 3, 0, 1)  # move dims so the shape is [N, N, B, D]

mat3 = mat1 * mat2               # [B, D] now lines up with the two rightmost dims
mat3 = mat3.permute(2, 3, 0, 1)  # swap the dims back to [B, D, N, N]

mat3.shape
# torch.Size([1, 4, 2, 2])
```

Or you can multiply using einsum notation, like this:

```
b = 1
d = 4
n = 2
mat1 = torch.randn((1, 4))
mat2 = torch.randn((1, 4, 2, 2))

mat3 = torch.einsum('bd,bdij->bdij', mat1, mat2)

mat3.shape
# torch.Size([1, 4, 2, 2])
```
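For completeness, the `unsqueeze` approach mentioned in the question should also be safe: indexing with `None` (or calling `unsqueeze`) only creates a view with extra singleton dimensions, and autograd handles the broadcast correctly by summing gradients over the broadcast dims. A minimal sketch, using the same example shapes:

```python
import torch

mat1 = torch.randn(1, 4, requires_grad=True)  # [B, D]
mat2 = torch.randn(1, 4, 2, 2)                # [B, D, N, N]

# View mat1 as [B, D, 1, 1]; broadcasting expands the trailing singleton dims to N.
mat3 = mat1[:, :, None, None] * mat2

print(mat3.shape)  # torch.Size([1, 4, 2, 2])

# Gradients flow back to mat1's original [B, D] shape.
mat3.sum().backward()
print(mat1.grad.shape)  # torch.Size([1, 4])
```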