The problem I am solving does not have an explicit loss term, so I need to compute the gradient and update the weights manually. That means I cannot call loss.backward(), since the loss term does not exist.
How can I calculate the gradient w.r.t. the neural network's parameters? The network is a torch.nn.Module.
I am currently computing the gradient manually with
gradient = torch.autograd.grad(term, NN.parameters()), where term can be a scalar or a vector.
However, the shape of the gradient I receive is not what I expected.
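For context, torch.autograd.grad does not return a single flat gradient tensor; it returns a tuple with one tensor per parameter, each shaped like that parameter, which may explain the unexpected shape. A minimal sketch (the network and the scalar term here are illustrative stand-ins, not the original code):

```python
import torch

# Toy stand-in for NN: two linear layers, so there are four
# parameter tensors (two weights and two biases).
NN = torch.nn.Sequential(torch.nn.Linear(3, 4), torch.nn.Linear(4, 1))

x = torch.randn(5, 3)
term = NN(x).sum()  # a scalar built from the network output

# Returns a tuple: one gradient tensor per parameter, each with
# exactly the same shape as the corresponding parameter.
grads = torch.autograd.grad(term, NN.parameters())

for p, g in zip(NN.parameters(), grads):
    print(p.shape, g.shape)  # shapes match pairwise
```

If term is a non-scalar tensor, torch.autograd.grad additionally requires a grad_outputs argument of the same shape as term, so checking whether term is truly a scalar is a good first debugging step.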