Imagine I have this vector:
x = torch.rand(10)
and I want to multiply it by this tensor:
y = torch.rand(10, 20, 30, 3)
I can’t do
x * y directly. I can’t do
x.expand_as(y) * y either: expand can apparently only prepend new dimensions in front of the existing ones, so the trailing dimensions mismatch.
I can do
x.expand_as(y.T).T * y, but it isn’t that elegant. Any other solution? Can I somehow take x from 1 dimension to 4 dimensions (and more if y had more) without some very unreadable tricks?
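For reference, a minimal sketch of the situation described above (shapes as in the question; the reshaped-x comparison at the end is just an illustration, not part of the question):

```python
import torch

x = torch.rand(10)
y = torch.rand(10, 20, 30, 3)

# Broadcasting aligns shapes from the trailing dimension, so (10,) vs
# (10, 20, 30, 3) compares 10 with 3 and fails:
try:
    x * y
except RuntimeError as e:
    print("broadcast error:", e)

# Transposing y puts the size-10 dimension last, so x lines up with it:
z = x.expand_as(y.T).T * y
assert z.shape == y.shape
```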
I don’t know if this would be any more elegant, but you can let broadcasting do the equivalent of the expand() for you if you use .T (transpose) to make the dimensions over which to broadcast be the leading dimensions:
(x * y.T).T
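A quick sketch checking that this produces the expected result (the comparison against an explicitly reshaped x is only for verification):

```python
import torch

x = torch.rand(10)
y = torch.rand(10, 20, 30, 3)

# y.T reverses all dimensions: (10, 20, 30, 3) -> (3, 30, 20, 10),
# so x broadcasts against the trailing dimension of size 10.
out = (x * y.T).T
assert out.shape == (10, 20, 30, 3)

# Same values as scaling each y[i] by x[i] via an explicit reshape:
assert torch.allclose(out, x.view(10, 1, 1, 1) * y)
```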
Another problem with using
.T is that it’s deprecated for tensors with more than two dimensions, so there must be another way to do this with at least the same readability?
At the cost of typing a few more characters you could use:
(x * y.transpose(0, -1)).transpose(0, -1)
(.T and .transpose(0, -1) don’t do exactly the same thing,
but they do both move the common dimension (in your case, the one of size 10) to
be the trailing dimension, which is all you need to get broadcasting to
work for this use case.)
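A sketch of this variant, again checked against an explicit reshape of x for illustration:

```python
import torch

x = torch.rand(10)
y = torch.rand(10, 20, 30, 3)

# transpose(0, -1) swaps only the first and last dimensions:
# (10, 20, 30, 3) -> (3, 20, 30, 10), leaving the middle dims in place,
# whereas .T would reverse all four. Either way the size-10 dim ends up last.
result = (x * y.transpose(0, -1)).transpose(0, -1)
assert result.shape == y.shape
assert torch.allclose(result, x.view(10, 1, 1, 1) * y)
```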