C++ backward function

If I want to write an operation as a C++ extension, is it necessary to write a corresponding backward function for that operation, even if there are no trainable weights in it?

I have dug into the source and found that every built-in loss function comes with a backward function, e.g. binary_cross_entropy_with_logits has a corresponding binary_cross_entropy_with_logits_backward.

If you are using torch::Tensor as the input to your C++ function, and you are only doing PyTorch operations on it, no backward function is necessary (just like in Python, where you don't write one).
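For example, a minimal sketch of such a function (the op and its name, scaled_sigmoid, are made up for illustration) needs no backward function because it is composed entirely of built-in differentiable ops:

```cpp
#include <torch/extension.h>

// Composed entirely of built-in differentiable ops, so autograd derives
// the backward pass on its own; no explicit backward function is needed.
torch::Tensor scaled_sigmoid(torch::Tensor input, double scale) {
  return torch::sigmoid(input) * scale;
}

PYBIND11_MODULE(TORCH_EXTENSION_NAME, m) {
  m.def("scaled_sigmoid", &scaled_sigmoid, "sigmoid(x) * scale");
}
```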
However, if you are making custom calls, such as going into OpenCV and editing the tensor's contents, then a custom backward function is necessary, as autograd doesn't know how to differentiate through those calls.
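In that case you can state the gradient yourself with torch::autograd::Function (the C++ counterpart of Python's torch.autograd.Function). A sketch, using a stand-in op that just doubles the input, where you should imagine the real forward round-tripping through OpenCV:

```cpp
#include <torch/extension.h>

using torch::autograd::AutogradContext;
using torch::autograd::Function;
using torch::autograd::tensor_list;

struct DoubleOp : public Function<DoubleOp> {
  static torch::Tensor forward(AutogradContext* ctx, torch::Tensor input) {
    // Pretend this line leaves PyTorch (e.g. an OpenCV call) and comes
    // back with the input doubled; autograd cannot trace through it.
    return input * 2;
  }

  static tensor_list backward(AutogradContext* ctx, tensor_list grad_outputs) {
    // d(2x)/dx = 2, so scale the incoming gradient by hand.
    return {grad_outputs[0] * 2};
  }
};

torch::Tensor double_op(torch::Tensor input) {
  return DoubleOp::apply(input);
}
```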


Thank you @smth, really helpful.
So, according to this, is the backward function of BCE unnecessary? It doesn't call any fancy functions that cannot be differentiated.

Yes, but make sure you are using an input of torch::Tensor, and not at::Tensor. torch::Tensor is autograd-ready.
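A quick sanity check that gradients actually flow, reusing the hypothetical double_op from the sketch above:

```cpp
#include <torch/torch.h>
#include <iostream>

int main() {
  // Create the input via torch::requires_grad() so autograd tracks it.
  auto x = torch::randn({3}, torch::requires_grad());
  auto y = double_op(x).sum();  // hypothetical op from the sketch above
  y.backward();
  std::cout << x.grad() << std::endl;  // a tensor of 2s
}
```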
