I am trying to train with a custom loss. All of PyTorch's native losses work fine, but this one does not.
My loss takes all of the predicted points, selects only a subset of them, and applies the same selection to the segmentation mask. This is one of the losses I am trying to implement.
During this selection, the tensor shape gets reduced from, say, 100 points in the prediction to, say, 10 points, and those 10 points are then used to compute the loss.
Is there a way to solve this?
Calling backward() raises: RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn
File "C:\Users\LJ\anaconda3\envs\toolkit\lib\site-packages\torch\_tensor.py", line 396, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph, inputs=inputs)
File "C:\Users\LJ\anaconda3\envs\toolkit\lib\site-packages\torch\autograd\__init__.py", line 173, in backward
    Variable._execution_engine.run_backward(  # Calls into the C++ engine to run the backward pass
RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn
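For context, this error usually means the selection step detached the tensors from the autograd graph (e.g. by round-tripping through NumPy or rebuilding a tensor with torch.tensor). A minimal sketch of what I mean, with made-up tensor names and a boolean-mask selection as an assumption about how the subset is chosen:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

pred = torch.randn(100, requires_grad=True)          # 100 predicted points
target = torch.randint(0, 2, (100,)).float()         # segmentation mask

# Differentiable selection: indexing with a boolean mask keeps grad_fn,
# so the reduced tensor (say ~10 points) still backpropagates to `pred`.
keep = target > 0
loss = F.mse_loss(pred[keep], target[keep])
assert loss.requires_grad                            # graph is intact
loss.backward()                                      # works, pred.grad is populated

# Graph-breaking selection (a likely cause of the RuntimeError):
# detaching / going through NumPy / rebuilding with torch.tensor
broken = torch.tensor(pred.detach().numpy()[keep.numpy()])
# broken.requires_grad is False, so a loss built from it would raise
# "element 0 of tensors does not require grad and does not have a grad_fn"
```

The sketch selects a subset with plain tensor indexing, which autograd tracks; if the real loss converts points to Python lists or NumPy arrays before selecting, that is where the graph breaks.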