Hi everyone,
Is there any way to use the autograd mechanism inside a C++ extension?
Specifically, I want to write a custom operator in C++ that calls some off-the-shelf functions such as at::softmax, and I would rather not re-implement the backward pass of softmax by hand.
The following pseudo-code sketches what I need:
at::Tensor forward(at::Tensor input) {
    auto a = custom_func(input);          // my custom part
    auto b = at::softmax(a, /*dim=*/-1);  // off-the-shelf op whose backward I want for free
    save_for_backward_somehow(a, b);
    return b;
}

at::Tensor backward(at::Tensor out_grad) {
    auto [a, b] = saved_tensors_somehow();
    b.backward(out_grad);                 // avoid re-implementing the backward of softmax
    auto grad_of_a = a.grad();            // after calling backward on b, read the gradient of a somehow
    return get_input_grad(grad_of_a);
}
I would appreciate any comments or suggestions.