Grad_fn for a Classifier's Output

I am trying to use the results of a classifier written in sklearn that is not available in torch. The outputs of this classifier are used to build a loss function that should update the weights of a PyTorch nn.Module. Since the outputs are not produced by PyTorch operations, the weights of the module never get updated. Setting requires_grad on the resulting loss and removing the detach() call did not change the outcome. A snippet illustrating this setup is below. Any help would be greatly appreciated.
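
A minimal sketch of the setup (the module, shapes, and the sklearn classifier here are placeholders, not my actual code):

```python
import numpy as np
import torch
import torch.nn as nn
from sklearn.linear_model import LogisticRegression

# Placeholder module and classifier -- just the shape of the problem:
# a torch module feeding an sklearn model.
model = nn.Linear(10, 4)
clf = LogisticRegression().fit(np.random.randn(100, 4), np.arange(100) % 2)

x = torch.randn(8, 10)
features = model(x)  # tracked by autograd

# Leaving PyTorch here breaks the computation graph: everything that
# happens in numpy/sklearn is invisible to autograd ...
probs = clf.predict_proba(features.detach().numpy())

# ... and wrapping the result back into a tensor (even with
# requires_grad=True) starts a new graph that never reaches `model`.
loss = torch.tensor(probs, requires_grad=True).mean()
loss.backward()
print(model.weight.grad)  # None -- the module's weights never update
```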

Since you are using numpy for the loss calculation, you are detaching this tensor from the computation graph, and Autograd won't be able to calculate the gradients for the parameters used in your model.
You could either reimplement the numpy method in PyTorch, or write a custom autograd.Function that defines the forward and backward methods (as described here) using the numpy operations.
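
Here is a minimal sketch of the custom autograd.Function approach, assuming you can express the gradient of the numpy computation analytically. A toy numpy loss (mean of squared values) stands in for your sklearn-based one; for a real sklearn classifier you would have to derive or approximate its gradient yourself in the backward method:

```python
import numpy as np
import torch

class NumpyLoss(torch.autograd.Function):
    @staticmethod
    def forward(ctx, input):
        ctx.save_for_backward(input)
        x = input.detach().cpu().numpy()
        loss = np.mean(x ** 2)                  # numpy computation
        return input.new_tensor(loss)

    @staticmethod
    def backward(ctx, grad_output):
        input, = ctx.saved_tensors
        x = input.detach().cpu().numpy()
        grad = 2.0 * x / x.size                 # d(mean(x^2))/dx, done in numpy
        return grad_output * input.new_tensor(grad)

model = torch.nn.Linear(10, 4)
out = model(torch.randn(8, 10))
loss = NumpyLoss.apply(out)
loss.backward()
print(model.weight.grad.shape)  # gradients now reach the module's parameters
```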
