# Element-wise multiplication of a vector and a matrix

I’m searching the net for a multiplication that is applied between a 1-d tensor and an n-d tensor.
Expected behaviour:

```python
A = torch.tensor(np.array([[[1., 0., 0.],
                            [0., 1., 0.],
                            [0., 0., 1.]],
                           [[2., 0., 0.],
                            [0., 2., 0.],
                            [0., 0., 2.]]]))

v = torch.tensor(np.array([2.0, 1.0]))

torch.element_wise(v, A)
```

expected result:

```python
tensor([[[2., 0., 0.],
         [0., 2., 0.],
         [0., 0., 2.]],

        [[2., 0., 0.],
         [0., 2., 0.],
         [0., 0., 2.]]])
```

I found a solution, but I wonder if there is a simpler one?

```python
def ev(a, b):
    # reshape a to (N, 1, ..., 1) so it broadcasts over the trailing dims of b
    N = b.shape[0]
    shp = b.shape[1:]
    return a.view(N, *([1] * len(shp))).expand(N, *shp) * b

R = ev(v, A)  # gives the right answer
```

You can do the following:

```python
v.view(-1, 1, 1).expand_as(A) * A
```

Note that automatic broadcasting takes care of the expand, so you can simply do:

```python
v.view(-1, 1, 1) * A
```

That works as well.
I usually prefer the version with explicit functions over advanced indexing because I know when I get a view of the data and when I get a copy.
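To illustrate the view-vs-copy distinction: a small sketch, assuming we check storage sharing via `data_ptr()` (`view` and `expand` return views that share the original storage, while advanced indexing materialises a copy):

```python
import torch

v = torch.tensor([2.0, 1.0])

# view + expand: no data is copied, the result aliases v's storage
expanded = v.view(-1, 1, 1).expand(2, 3, 3)
assert expanded.data_ptr() == v.data_ptr()

# advanced (fancy) indexing: a new tensor with its own storage
indexed = v[[0, 1]]
assert indexed.data_ptr() != v.data_ptr()
```

This matters when you mutate the result in place or care about memory: writing through a view also changes the original tensor, whereas a copy is independent.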