# How to broadcast a 1D tensor with a 4D tensor?

Suppose a has size (128,) and b has size (128, 3, 64, 64). How do I do broadcast multiplication with these two tensors?

2 Likes

I guess you need to convert tensor A to size (128, 1) and then multiply it with tensor B.

```python
output = A.unsqueeze(1) * B
```

@kelam_goutam Thanks for your reply. It works if the size of b is (128, 3), i.e., b is a 2-dimensional tensor, but it doesn’t work when b has more than 2 dimensions.

1 Like

the most generic way would be to do something like this (which is not broadcasting, but explicit expanding; note that `a` first needs trailing singleton dimensions, since `expand` aligns trailing dims):

```python
output = a.view(-1, 1, 1, 1).expand_as(b) * b
```

Edit:
if you want to rely on broadcasting instead, you can reshape `a` and let the multiplication broadcast:

```python
output = a.view(-1, 1, 1, 1) * b
```

If the shapes cannot be broadcast, an error will be raised; you will get the same error with the explicit approach.

1 Like
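A minimal sketch of why the reshape is needed, assuming the shapes from the original question: broadcasting aligns trailing dimensions, so the bare (128,) would line up with b's last dim (64) and fail.

```python
import torch

a = torch.randn(128)
b = torch.randn(128, 3, 64, 64)

# a * b would raise a size-mismatch error: (128,) aligns with b's
# trailing dim of size 64.  Trailing singleton dims line a's 128 up
# with b's first (batch) dimension instead.
output = a.view(-1, 1, 1, 1) * b  # shape (128, 3, 64, 64)
```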

Thanks for your reply. I tried the solution, but it does not work as I expected.
A related question: suppose I have a = torch.tensor([1, 2, 3]). How do I expand it to

```python
tensor([[[1, 1, 1],
         [1, 1, 1]],

        [[2, 2, 2],
         [2, 2, 2]],

        [[3, 3, 3],
         [3, 3, 3]]])
```

Thanks!
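One way to produce exactly that result (a sketch; the variable name `out` is mine): add two trailing singleton dimensions with `view`, then `expand`.

```python
import torch

a = torch.tensor([1, 2, 3])
# view gives shape (3, 1, 1); expand repeats the singleton dims
# to (3, 2, 3) without copying data
out = a.view(3, 1, 1).expand(3, 2, 3)
```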

A crude way to solve your problem could be an extension of my previous solution.

```python
output = A.unsqueeze(1).unsqueeze(2).unsqueeze(3) * B
```

This basically makes A a 4D tensor of shape (128, 1, 1, 1).

1 Like

@kelam_goutam Thanks!

These also work:

```python
output = a.reshape(128, 1, 1, 1) * b
```

or

```python
output = a.reshape(-1, 1, 1, 1) * b
```

1 Like
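For what it's worth, the triple-unsqueeze and the reshape spellings produce identical results; a quick sanity check with the shapes from the original question:

```python
import torch

a = torch.randn(128)
b = torch.randn(128, 3, 64, 64)

o1 = a.unsqueeze(1).unsqueeze(2).unsqueeze(3) * b  # explicit unsqueezes
o2 = a.reshape(-1, 1, 1, 1) * b                    # reshape shorthand
assert torch.equal(o1, o2)
```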

You can broadcast a vector to a higher dimensional tensor like so:

```python
def row_mult(t, vec):
    # one trailing singleton dim for every non-batch dimension of t
    extra_dims = (1,) * (t.dim() - 1)
    return t * vec.view(-1, *extra_dims)
```
1 Like
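A usage sketch for the helper above (repeated here so the snippet is self-contained; shapes taken from the original question):

```python
import torch

def row_mult(t, vec):
    # one trailing singleton dim for every non-batch dimension of t
    extra_dims = (1,) * (t.dim() - 1)
    return t * vec.view(-1, *extra_dims)

t = torch.randn(128, 3, 64, 64)
vec = torch.randn(128)
out = row_mult(t, vec)  # shape (128, 3, 64, 64)
```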