I got this error during a call to `loss.backward()`:
```
Traceback (most recent call last):
  File "prova.py", line 60, in <module>
    loss.backward()
  File "/home/simone/anaconda3/lib/python3.6/site-packages/torch/autograd/variable.py", line 156, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph, retain_variables)
  File "/home/simone/anaconda3/lib/python3.6/site-packages/torch/autograd/__init__.py", line 98, in backward
    variables, grad_variables, retain_graph)
  File "/home/simone/anaconda3/lib/python3.6/site-packages/torch/autograd/function.py", line 91, in apply
    return self._forward_cls.backward(self, *args)
  File "/home/simone/anaconda3/lib/python3.6/site-packages/torch/autograd/_functions/reduce.py", line 26, in backward
    return grad_output.expand(ctx.input_size), None, None
  File "/home/simone/anaconda3/lib/python3.6/site-packages/torch/autograd/variable.py", line 722, in expand
    return Expand.apply(self, sizes)
  File "/home/simone/anaconda3/lib/python3.6/site-packages/torch/autograd/_functions/tensor.py", line 111, in forward
    result = i.expand(*new_size)
RuntimeError: invalid argument 1: the number of sizes provided must be greater or equal to the number of dimensions in the tensor at /opt/conda/conda-bld/pytorch_1503970438496/work/torch/lib/TH/generic/THTensor.c:298
```
I'm using the PyTorch 0.2.0 package from Conda. Here is a short example I wrote which should reproduce the error. Basically, I have a model that takes two vectors and computes some kind of loss. During training, at a certain point (after ~100 iterations with Adam, and somewhat more with SGD), I get that error. In the script I'm providing, I removed the training part, so it should raise the error at the first iteration. Additionally, I included in the script the state of the model at a point where the error occurs regardless: just change `if False` to `if True` at line 45.
I noticed that the error only occurs when the final loss is computed as at line 34 or at line 35 (currently commented out), and not at line 33, i.e., only when the `margin` variable is indexed by itself.
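For clarity, by "indexed by itself" I mean a pattern roughly like the following (a simplified sketch, not my actual script; the shapes and the way the indices are derived are purely illustrative):

```python
import torch
from torch.autograd import Variable

x = Variable(torch.randn(5), requires_grad=True)
margin = x * 2  # some differentiable function of the parameters

# Indices computed from margin's own values (illustrative: here, the
# positions of the three smallest absolute values).
idx = margin.data.abs().sort()[1][:3]

# margin indexed by indices derived from itself -- the line-34/35 pattern.
loss = margin[idx].sum()

# In my real script, backward() on a loss built this way is what
# triggers the RuntimeError shown above.
loss.backward()
```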