Using `backward()` of custom autograd function with loss.backward()

Hi community, I want to create a custom activation function, which requires a custom autograd function (something similar to the MyRelu example linked here). When I apply it, forward() is called and its definition is used to compute the activation, but when I call loss.backward(), the backward() I defined is not called. Even the MyRelu example (linked above) doesn't seem to use its backward() function.
I'm not sure which part I'm misunderstanding. Any insights and corrections would be much appreciated. Thanks in advance :slight_smile:
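
For reference, a minimal sketch of this kind of custom autograd function, modeled on the tutorial's MyReLU example (the class name and details here are illustrative):

```python
import torch

class MyReLU(torch.autograd.Function):
    @staticmethod
    def forward(ctx, input):
        # Save the input so backward() can mask the gradient.
        ctx.save_for_backward(input)
        return input.clamp(min=0)

    @staticmethod
    def backward(ctx, grad_output):
        # This is the method in question: it should run when
        # loss.backward() reaches this node of the autograd graph.
        input, = ctx.saved_tensors
        grad_input = grad_output.clone()
        grad_input[input < 0] = 0
        return grad_input
```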

How did you check whether the backward was called or not?
If I add a simple print statement to the backward method, it gets printed once loss.backward() is called.
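
For example, assuming the MyReLU sketch above with a `print("backward called")` added at the top of its backward():

```python
x = torch.randn(5, requires_grad=True)
out = MyReLU.apply(x)   # invoke the custom function via .apply()
loss = out.sum()
loss.backward()         # "backward called" should be printed here
print(x.grad)           # a non-None gradient also confirms backward ran
```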

Note that if your forward is not doing anything (it returns the input or a view of the input), this can happen when you make in-place changes to the output.

Otherwise, I would follow @ptrblck's advice and double-check how you test this.
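
One quick sanity check (a sketch, assuming the MyReLU example above) is to inspect the grad_fn of the output: if the custom function is actually part of the autograd graph, it shows up there.

```python
x = torch.randn(5, requires_grad=True)
out = MyReLU.apply(x)
# Expected: something like <torch.autograd.function.MyReLUBackward object at 0x...>
print(out.grad_fn)
```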