Using custom autograd function in C++ to connect non-pytorch model in loss function

Dear all,

I plan to use PyTorch’s C++ front-end for developing physics-informed neural network applications. While the network itself will be based on libtorch, the physics model is implemented on top of the Eigen library and has full algorithmic differentiation capabilities (forward and reverse mode) via the open-source CoDiPack library.

From what I read at https://pytorch.org/tutorials/advanced/cpp_autograd.html#using-custom-autograd-function-in-c it should be possible to wrap the external physics model in a custom autograd class that provides statically defined forward and backward methods, in which (i) the connection with Eigen is made and (ii) the forward and backward propagation is carried out using CoDiPack.
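To make the question concrete, here is a minimal sketch of what I have in mind, following the `torch::autograd::Function` pattern from that tutorial. The external physics model is replaced by a stand-in (elementwise square) so the example is self-contained; the comments mark where the Eigen/CoDiPack calls would go, and those helpers are of course assumptions about my own code, not part of libtorch:

```cpp
#include <torch/torch.h>
#include <iostream>

using torch::autograd::AutogradContext;
using torch::autograd::Function;
using torch::autograd::tensor_list;

// Custom autograd function wrapping an "external" model.
// In the real application, forward() would copy the tensor into an Eigen
// structure, run the CoDiPack-taped physics model, and copy the result back,
// while backward() would evaluate the CoDiPack reverse tape to obtain the
// vector-Jacobian product. Here the model is a stand-in so the code compiles.
struct PhysicsOp : public Function<PhysicsOp> {
  static torch::Tensor forward(AutogradContext* ctx, torch::Tensor input) {
    ctx->save_for_backward({input});
    // --- external forward pass (Eigen/CoDiPack would go here) ---
    auto x = input.detach();   // hand raw data to the external code
    auto y = x * x;            // stand-in for the physics model
    return y;
  }

  static tensor_list backward(AutogradContext* ctx, tensor_list grad_outputs) {
    auto saved = ctx->get_saved_variables();
    auto input = saved[0];
    // --- external reverse pass (CoDiPack reverse tape would go here) ---
    // d(x*x)/dx = 2x, chained with the incoming gradient
    auto grad_input = grad_outputs[0] * 2.0 * input;
    return {grad_input};
  }
};

int main() {
  torch::Tensor x = torch::ones({3}, torch::requires_grad());
  torch::Tensor y = PhysicsOp::apply(x);
  y.sum().backward();
  std::cout << x.grad() << std::endl;  // expect 2 2 2
  return 0;
}
```

Is this the intended way to splice an external adjoint computation into the libtorch graph, or are there pitfalls (e.g. regarding saved variables or device/dtype handling) I should be aware of?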

Does anyone have experience connecting libtorch with an ‘external model’ via custom autograd classes?

Thanks in advance,
Matthias