Is it possible to use the Eigen lib to manipulate a tensor in place without detaching it from the autograd graph?

Hi there,

Is it possible to use Eigen or the C++ standard library to do some operations on a torch::Tensor without detaching it from the autograd graph? Or will I need to rewrite all the operations using torch Tensor ops?

Thanks!

Hi,

If you want your Eigen/std lib operations recorded, you’ll want to either create a custom autograd function or rewrite the operations using PyTorch ops.
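For the custom autograd function route, here’s a minimal sketch of what that can look like in C++. The `EigenCube` name and the element-wise cube are just illustrative (not from this thread): the forward runs through Eigen on the raw buffer, and the backward supplies the gradient formula by hand via `torch::autograd::Function`. It assumes a float CPU tensor; a real version would check dtype/device.

```cpp
#include <torch/torch.h>
#include <Eigen/Dense>

using torch::autograd::AutogradContext;
using torch::autograd::Function;
using torch::autograd::variable_list;

// Illustrative example: element-wise cube, forward computed with Eigen,
// backward (3 * x^2) written by hand with torch ops.
struct EigenCube : public Function<EigenCube> {
  static torch::Tensor forward(AutogradContext* ctx, torch::Tensor input) {
    ctx->save_for_backward({input});
    auto x = input.contiguous();  // Eigen::Map needs a plain dense buffer
    auto out = torch::empty_like(x);
    // Map the float buffers as 1-D Eigen arrays (no copy).
    Eigen::Map<const Eigen::ArrayXf> in(x.data_ptr<float>(), x.numel());
    Eigen::Map<Eigen::ArrayXf> res(out.data_ptr<float>(), out.numel());
    res = in.cube();  // this part runs outside the autograd graph
    return out;
  }

  static variable_list backward(AutogradContext* ctx, variable_list grad_output) {
    auto input = ctx->get_saved_variables()[0];
    // Chain rule: dL/dx = dL/dy * 3x^2 (one grad per forward input).
    return {grad_output[0] * 3 * input * input};
  }
};
```

Calling it as `auto y = EigenCube::apply(x);` then lets gradients flow through the Eigen computation, since autograd uses your handwritten backward.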

If you don’t want the operations recorded but still want to modify the Tensor in place, I’m curious what the use case would be, since the computed gradients would be wrong.
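A tiny sketch of what that silent wrongness looks like (my own illustrative example, assuming a float CPU tensor): a write through `data_ptr()` is invisible to autograd, so backward differentiates the recorded graph rather than what you actually computed.

```cpp
#include <torch/torch.h>
#include <iostream>

int main() {
  auto x = torch::full({1}, 3.0f, torch::requires_grad());
  auto y = x * x;                 // autograd records y = x^2
  y.data_ptr<float>()[0] = 5.0f;  // out-of-graph write; autograd never sees it
  y.backward();
  // Prints 6 (= 2 * x), the gradient of the recorded graph. But after the
  // edit, y no longer depends on x, so the true gradient is 0: no error is
  // raised, the result is just silently wrong.
  std::cout << x.grad() << std::endl;
}
```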

Yeah… I want them to be recorded… so I just want to confirm that I’ll need to rewrite the whole thing before doing the work haha


Hey,

Unfortunately autograd is not magic. It works with all PyTorch ops because we manually wrote the backward formula for every one of them.
So for it to work here, you need to either:

- write a custom autograd function, implementing the forward with your Eigen/std code and providing the backward formula yourself (like the `EigenCube` sketch above), or
- rewrite the computation with PyTorch ops so autograd can derive the gradients for you (see the snippet below).
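For the second option, the same cube from the sketch above needs no handwritten backward once it’s expressed in torch ops:

```cpp
#include <torch/torch.h>

int main() {
  auto x = torch::rand({5}, torch::requires_grad());
  auto y = (x * x * x).sum();  // same math as the Eigen version, in torch ops
  y.backward();                // autograd derives the formula itself
  // x.grad() == 3 * x^2, no custom Function needed.
}
```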