Alternative to broadcasting

I am a bit confused about the rules of shape broadcasting.

Is there any safer alternative? For example, if I have a matrix of shape p × q, how can I duplicate it r times to make a tensor of shape p × q × r?
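For concreteness, here is a sketch of what I mean (made-up sizes; stacking copies is the only way I currently know):

import torch

p, q, r = 4, 5, 3
m = torch.randn(p, q)           # the original p x q matrix

# naive duplication: stack r copies of m along a new trailing dimension
t = torch.stack([m] * r, dim=2)
print(t.size())                 # torch.Size([4, 5, 3])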

EDIT: specifically, here is the confusion I ran into.

I have the following piece of code:

print(reshape_q.size())
print(mean_p.size())
diff = -1.0*mean_p + reshape_q

but it prints the following sizes and then fails with this error:
torch.Size([700, 100, 1])
torch.Size([700, 100, 3])
Traceback (most recent call last):
  File "main.py", line 238, in <module>
    train()
  File "main.py", line 161, in train
    loss, first, kld = loss_function(decoded, targets, mean_output, logvar_output, z)
  File "main.py", line 230, in loss_function
    KLD = KL_G_vs_GMM(mu.view(shape), logvar.view(shape), z.view(shape), reference_means)
  File "main.py", line 209, in KL_G_vs_GMM
    diff = -1.0*mean_p + reshape_q
  File "/public/apps/anaconda3/4.3.1/lib/python3.6/site-packages/torch/autograd/variable.py", line 745, in __add__
    return self.add(other)
  File "/public/apps/anaconda3/4.3.1/lib/python3.6/site-packages/torch/autograd/variable.py", line 283, in add
    return self._add(other, False)
  File "/public/apps/anaconda3/4.3.1/lib/python3.6/site-packages/torch/autograd/variable.py", line 277, in _add
    return Add(inplace)(self, other)
  File "/public/apps/anaconda3/4.3.1/lib/python3.6/site-packages/torch/autograd/_functions/basic_ops.py", line 20, in forward
    return a.add(b)
RuntimeError: sizes do not match at /py/conda-bld/pytorch_1493680494901/work/torch/lib/THC/generated/…/generic/THCTensorMathPointwise.cu:216

What is your PyTorch version?

I guess that broadcasting is only supported starting with version 0.2.0.
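A quick way to check which version is installed:

import torch
print(torch.__version__)   # broadcasting requires >= 0.2.0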

Broadcasting only adds new leading dimensions (not trailing ones) when aligning tensors with different numbers of dimensions.

What you can do is this:

p.unsqueeze(-1).expand_as(q)
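For example (a sketch with made-up tensors matching the shapes discussed above; note that expand_as returns a view, so no memory is copied):

import torch

p = torch.randn(700, 100)          # missing the trailing dimension
q = torch.randn(700, 100, 3)

# add a trailing singleton dimension, then expand it to match q (a view, no copy)
p3 = p.unsqueeze(-1).expand_as(q)
print(p3.size())                   # torch.Size([700, 100, 3])
diff = -1.0 * p3 + q               # elementwise ops now line up

If you need an actual copy rather than a view, p.unsqueeze(-1).repeat(1, 1, 3) materializes the duplicated data, which also answers the original p × q → p × q × r question.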

@Tengyu_MA's tensors have the same number of dimensions, though, so broadcasting should be supported in this case (and it is, at least on master).
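For example, this runs fine on a recent version (a quick check using the exact shapes from the traceback):

import torch

reshape_q = torch.randn(700, 100, 1)
mean_p = torch.randn(700, 100, 3)

# the size-1 trailing dimension of reshape_q is broadcast against 3
diff = -1.0 * mean_p + reshape_q
print(diff.size())                 # torch.Size([700, 100, 3])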

Yeah, that might be the issue, because I tried a synthetic case and it worked!

So the reason must be that I was using an older version.