Loss.backward() error when using custom autograd function

Hi All!

I am facing the error:
AttributeError: 'CheckpointFunctionBackward' object has no attribute 'input_tensors'

Please find the implementation of my custom backward function.

Hi @torch2.0,

When sharing code, please copy and paste the code and wrap it within 3 backticks ```, rather than sharing a screenshot.

Your issue is that you haven’t saved the input_tensors object within the forward pass with ctx.save_for_backward(input_tensors); that’s why it doesn’t exist within the backward pass. Anything you save this way is then available in backward via ctx.saved_tensors.

Also, I don’t believe you can invoke a torch.autograd.grad call within the backward method, as it’s run within a torch.no_grad() context (even if you enable gradient tracking via torch.enable_grad() yourself). You need to define the gradient formula in the backward method manually.
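To illustrate both points, here is a minimal sketch of a custom autograd Function (the class and variable names are just for illustration, not from your code): forward stashes its input with ctx.save_for_backward, and backward retrieves it via ctx.saved_tensors and returns a hand-written gradient instead of calling torch.autograd.grad:

```python
import torch

class Square(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        # Save the input so backward can access it later.
        ctx.save_for_backward(x)
        return x ** 2

    @staticmethod
    def backward(ctx, grad_output):
        # Retrieve what forward saved.
        (x,) = ctx.saved_tensors
        # Gradient formula written manually: d(x^2)/dx = 2x.
        return grad_output * 2 * x

x = torch.tensor([3.0], requires_grad=True)
y = Square.apply(x)
y.backward()
print(x.grad)  # tensor([6.])
```

If forward never calls ctx.save_for_backward (or doesn’t set the attribute on ctx directly), accessing it in backward raises exactly the kind of AttributeError you’re seeing.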

Hi @AlphaBetaGamma96,

Thanks for your suggestion. I will make the corrections.