How can I debug this?
File "training.py", line 48, in train
loss.backward() # compute the gradients
File "/home/user/anaconda3/lib/python3.6/site-packages/torch/autograd/variable.py", line 146, in backward
self._execution_engine.run_backward((self,), (gradient,), retain_variables)
RuntimeError: sizes do not match at /py/conda-bld/pytorch_1493680494901/work/torch/lib/THC/generated/…/generic/THCTensorMathPointwise.cu:216
My loss function is really complex, and I suspect the problem is in one of the abs() or exp() operations, but I can't find out where, and the error message doesn't say much. I installed the release PyTorch via conda.
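For reference, here is the kind of sketch I mean, not my actual loss: a toy function (all names and shapes are made up) that prints the size of every intermediate tensor, so a shape that silently broadcasts in the forward pass and only blows up in backward() becomes visible. This uses the current requires_grad API rather than the old Variable wrapper:

```python
import torch

def debug_loss(pred, target):
    # Print the size after every op so the mismatching one is obvious.
    diff = pred - target
    print("diff:", tuple(diff.size()))
    a = diff.abs()
    print("abs: ", tuple(a.size()))
    e = torch.exp(-a)
    print("exp: ", tuple(e.size()))
    return e.sum()

pred = torch.randn(4, 1, requires_grad=True)
target = torch.randn(4)   # (4,) vs (4, 1) silently broadcasts to (4, 4)
loss = debug_loss(pred, target)
loss.backward()           # a size bug often only surfaces here
```

Bisecting the loss this way (or summing and backpropagating each term separately) narrows the error down to a single operation.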