Autograd reference lost during cost calculation

Hello guys,

My cost function is very specific and requires a lot of external calculations, including calls to external libraries, to produce the cost value. Because of this, it is impossible to keep those calculations within tensor operations. The problem with leaving the tensor context is that I lose the autograd history. How can I take the scalar value produced by my cost function, wrap it in a tensor, and reconnect it to the autograd graph?

You can do that by defining a custom autograd.Function. This allows you to do anything you want in the forward pass. The caveat is that you have to provide the derivative yourself in the backward pass (since PyTorch cannot track operations that happen outside of tensors).
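
Here is a minimal sketch of the idea, assuming a NumPy-based external computation with a known analytic gradient. The `external_cost` and `external_cost_grad` helpers below are hypothetical stand-ins for your own calculations:

```python
import torch
import numpy as np

# Hypothetical stand-in for your external, non-tensor cost computation.
def external_cost(x_np):
    return float(np.sum(x_np ** 2))

# Hypothetical analytic gradient of that cost w.r.t. its input.
def external_cost_grad(x_np):
    return 2.0 * x_np

class ExternalCost(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        # Leave the tensor world: hand a plain NumPy array to the external code.
        x_np = x.detach().cpu().numpy()
        ctx.save_for_backward(x)
        cost = external_cost(x_np)
        # Re-enter the tensor world as a scalar tensor on the same device/dtype.
        return x.new_tensor(cost)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Supply the derivative manually, since autograd cannot see
        # through the external computation.
        grad_np = external_cost_grad(x.detach().cpu().numpy())
        grad = torch.as_tensor(grad_np, dtype=x.dtype, device=x.device)
        return grad_output * grad

# Usage: the returned scalar participates in the autograd graph.
x = torch.randn(5, requires_grad=True)
loss = ExternalCost.apply(x)
loss.backward()
print(x.grad)  # equals 2 * x for this toy cost
```

If you do not have an analytic gradient, you could approximate it in `backward` (e.g. with finite differences), but an exact derivative is preferable whenever it is available.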

Best regards

Thomas

Hello Thomas, thank you very much for the reply. I will test it and share the result as soon as possible.