Loss results computed by another program

Hi everyone,

I have a question about autograd backpropagation. If my loss is computed by another program (for example, in C++) and returned to PyTorch, can autograd still work? If not, how can I deal with it?

It depends. PyTorch is C++-based, and of course you can implement whatever you like in C++.
The point is that you need to work with the context variable, which is the core of autograd. In short, you will have to provide both the loss and its gradients to make it work.
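Just to sketch the idea with a custom torch.autograd.Function: I'm assuming hypothetical `external_loss` / `external_loss_grad` wrappers around your C++ program here, and the NumPy hand-off is just a placeholder for however you actually exchange data.

```python
import torch
from torch.autograd import Function


class ExternalLoss(Function):
    @staticmethod
    def forward(ctx, pred, target):
        # Hand the data to the external program (here via NumPy arrays).
        np_pred = pred.detach().cpu().numpy()
        np_target = target.detach().cpu().numpy()
        loss_value = external_loss(np_pred, np_target)   # hypothetical external call
        grad = external_loss_grad(np_pred, np_target)    # hypothetical: dLoss/dpred
        # Stash the externally computed gradient in the context for backward().
        ctx.save_for_backward(
            torch.as_tensor(grad, dtype=pred.dtype, device=pred.device)
        )
        return pred.new_tensor(loss_value)

    @staticmethod
    def backward(ctx, grad_output):
        (grad_pred,) = ctx.saved_tensors
        # Chain rule: scale the external gradient by the incoming grad_output.
        # Second return value is None because target needs no gradient.
        return grad_output * grad_pred, None
```

You would then use it like any other loss: `loss = ExternalLoss.apply(pred, target)` followed by `loss.backward()`, and autograd takes care of the rest of the graph.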


Hi, thank you for your help. It is really helpful :grinning:
But I'm still not clear on what I should do. If possible, do you know of any program that uses this method? Could you share some links?

Well, PyTorch provides a way to do so: https://pytorch.org/tutorials/advanced/c_extension.html. However, I've never done it myself, so I can't recommend a more detailed resource. Pretty sure you can successfully google it :slight_smile:
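If you do go the C++ extension route, the Python side can be as small as the sketch below. It assumes a hypothetical source file `external_loss.cpp` that exposes your loss and gradient functions through pybind11 (the file and module names are placeholders).

```python
from torch.utils.cpp_extension import load

# JIT-compiles the C++ source and imports it as a Python module.
ext = load(name="external_loss_ext", sources=["external_loss.cpp"])

# The compiled functions could then back the external_loss / external_loss_grad
# calls used in the autograd.Function sketch earlier in the thread.
```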