# How to broadcast in arbitrary dimensions?

By default, broadcasting aligns tensors starting from the last (trailing) dimension, but I want to broadcast along an arbitrary dimension.
For example, with a 4-dimensional tensor X and a 1-dimensional tensor Y, broadcast multiplication along the first dimension can be written as
`torch.einsum("ijkl,i->ijkl", X, Y)`

But is there a suitable function that broadcasts along the n-th dimension (1 <= n <= K) of a K-dimensional tensor against a 1-dimensional tensor?
(The broadcast behavior I want is equivalent to reshaping the 1-d tensor with `view(1, 1, -1, 1, ..., 1)`.)
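
The behavior I'm after could be sketched like this (`mul_along_dim` is a hypothetical helper name, not an existing PyTorch function):

```python
import torch

def mul_along_dim(X, y, dim):
    # View the 1-d tensor y as (1, ..., 1, -1, 1, ..., 1) so it
    # broadcasts against X along dimension `dim`.
    shape = [1] * X.dim()
    shape[dim] = -1
    return X * y.view(shape)

X = torch.randn(4, 5, 6, 7)
y = torch.randn(6)
out = mul_along_dim(X, y, 2)  # broadcast-multiply along dim 2
```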

Hi @tsp_t,

If you're trying to multiply across all dimensions, you can just use `*`. If you get a dimension-mismatch error, unsqueeze your 2nd tensor so that its shape broadcasts against the 1st tensor. For example,

```python
x1 = torch.randn(4,)
x2 = torch.randn(4, 5)

x1 * x2  # fails
# RuntimeError: The size of tensor a (4) must match the size of tensor b (5) at non-singleton dimension 1
```

However,

```python
x1[:, None] * x2  # works, returns tensor of shape [4, 5]
```

In your case, you just need to unsqueeze your 1-d tensor so it has 4 dimensions and multiply, i.e. `Y[:, None, None, None] * X`.
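
For completeness, a quick check (assuming the shapes from the original question) that the unsqueeze approach matches the einsum expression:

```python
import torch

X = torch.randn(4, 5, 6, 7)
Y = torch.randn(4)

# Insert singleton dims so Y has shape (4, 1, 1, 1) and broadcasts along dim 0.
out = Y[:, None, None, None] * X
```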