Custom function in PyTorch using outside libraries

Hello. I want to train a neural network whose loss comes from two sources: the network's output itself, and a function applied to that output. Basically, this function takes the values produced by the network and maps them to a new set of values. I want to simultaneously account for the loss on the raw network output AND the loss on the function of that output, so that the network learns predictions that also give the best function output.

For instance:

Loss = MSE(prediction, target) + MSE(function(prediction), function(target))

The important thing to note is that this function comes from an external Python library, and it is not something I can recreate in PyTorch. Would this work? What can I do to make it behave like a PyTorch function?
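Roughly what I have in mind (a minimal sketch; `external_function` is just a placeholder for the third-party call):

```python
import torch
import torch.nn as nn

mse = nn.MSELoss()

def combined_loss(prediction, target):
    # Direct loss on the network output
    loss_direct = mse(prediction, target)
    # Loss on the transformed values; external_function stands in for the
    # third-party call I would like to backpropagate through
    loss_transformed = mse(external_function(prediction), external_function(target))
    return loss_direct + loss_transformed
```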

You can create a custom autograd.Function and implement the forward as well as the backward for this operation using the 3rd party library as seen here.
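A minimal sketch of that pattern, assuming the third-party call operates elementwise on NumPy arrays and that you can supply its derivative yourself; `external_function` and `external_function_grad` are placeholders, not real library APIs:

```python
import torch

class ExternalFunction(torch.autograd.Function):
    @staticmethod
    def forward(ctx, input):
        # Detach and move to NumPy so the 3rd-party library can consume it
        input_np = input.detach().cpu().numpy()
        output_np = external_function(input_np)  # placeholder for the 3rd-party call
        ctx.save_for_backward(input)
        return torch.as_tensor(output_np, dtype=input.dtype, device=input.device)

    @staticmethod
    def backward(ctx, grad_output):
        input, = ctx.saved_tensors
        # You have to provide the derivative of the external function yourself,
        # e.g. analytically or via the library if it exposes one
        grad_np = external_function_grad(input.detach().cpu().numpy())  # placeholder
        grad_input = torch.as_tensor(grad_np, dtype=grad_output.dtype,
                                     device=grad_output.device)
        # Chain rule for an elementwise function
        return grad_output * grad_input

# Usage keeps the rest of the graph differentiable:
# out = ExternalFunction.apply(model(x))
```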


Thank you! What if the function itself is not known? Basically, I pass some input to a function from an external Python library and it gives me an output, so I can't derive its gradient analytically. I'm not entirely sure what I would put in the backward method. Would it just be the gradient of the loss with respect to the output?

In that case I don’t know if using this method in the forward pass would make sense.
Your suggestion of returning a static gradient might technically work, but since it wouldn’t reflect the true gradient of the external function, I don’t know what your model would learn from it.
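For concreteness, that suggestion amounts to a pass-through backward, roughly like this (a sketch only, with the same placeholder `external_function`); it treats the external function's Jacobian as all ones, which is exactly why the resulting gradients wouldn't be the true ones:

```python
import torch

class BlackBoxPassThrough(torch.autograd.Function):
    @staticmethod
    def forward(ctx, input):
        output_np = external_function(input.detach().cpu().numpy())  # placeholder
        return torch.as_tensor(output_np, dtype=input.dtype, device=input.device)

    @staticmethod
    def backward(ctx, grad_output):
        # Simply forward the upstream gradient unchanged; this ignores the
        # actual derivative of the external function
        return grad_output
```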