RuntimeError: sizes do not match at /py/conda-bld/pytorch_1493680494901/work/torch/lib/THC/generated/../generic/

How can I debug this?

  File "", line 48, in train
    loss.backward()  # compute the gradients
  File "/home/user/anaconda3/lib/python3.6/site-packages/torch/autograd/", line 146, in backward
    self._execution_engine.run_backward((self,), (gradient,), retain_variables)
RuntimeError: sizes do not match at /py/conda-bld/pytorch_1493680494901/work/torch/lib/THC/generated/…/generic/

My loss function is really complex, and I think the problem might be in one of the abs() or exp() operations, but I can't find where, and the error message doesn't say much. I'm using a conda-installed release of PyTorch.

It says you have two tensors whose sizes don't match. For example, you can't add two tensors if their sizes don't match.
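For instance (a minimal sketch, assuming a standard `torch` install; the tensor sizes here are made up), an element-wise add with mismatched sizes raises exactly this kind of RuntimeError:

```python
import torch

a = torch.ones(3, 4)
b = torch.ones(3, 5)

try:
    c = a + b  # sizes (3, 4) and (3, 5) don't match and can't broadcast
except RuntimeError as e:
    print("RuntimeError:", e)
```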

What I usually do in such cases is sprinkle print statements liberally just before the line in question, printing the .size() of each input tensor to the operation.
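In practice that looks something like this (a sketch; `a` and `b` are hypothetical stand-ins for the operands of your suspect abs()/exp() expression, with made-up sizes):

```python
import torch

# Hypothetical operands of the suspect operation
a = torch.randn(32, 10)
b = torch.randn(32, 12)

# Sprinkled prints just before the failing line
print('a:', a.size())  # torch.Size([32, 10])
print('b:', b.size())  # torch.Size([32, 12])

if a.size() != b.size():
    print('size mismatch -- this operation will fail')
```

Once the prints reveal which pair of sizes disagrees, you know exactly which upstream reshape/sum/index step produced the wrong shape.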

If you have a debugger, that might be faster. I haven't figured out how to use a Python debugger yet :stuck_out_tongue: . Or perhaps I just find print sufficient for me. Unclear…