I am writing my own loss function. I find that in version 0.4.0, a breakpoint set inside backward(self, grad_output) is never hit, which makes it very difficult to debug. Any suggestions, please? Thank you!
It is hard to tell without seeing the code. Are you writing an autograd Function, or a loss Module / plain Python function?
If it is the latter, you can register a backward hook and set a breakpoint inside it.
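Here is a minimal sketch of what I mean, assuming a simple MSE-style loss written as a plain Python function (the loss itself is just an illustration). `Tensor.register_hook` attaches a function that autograd calls with the gradient flowing into that tensor during `backward()`, so you can print it or drop a `pdb.set_trace()` there:

```python
import torch

def my_loss(pred, target):
    diff = pred - target          # intermediate tensor we want to inspect
    loss = (diff ** 2).mean()
    # The hook fires during backward() with the gradient w.r.t. `diff`.
    # Put a breakpoint (e.g. pdb.set_trace()) inside it to debug.
    diff.register_hook(lambda grad: print("grad w.r.t. diff:", grad.shape))
    return loss

pred = torch.randn(4, 3, requires_grad=True)
target = torch.randn(4, 3)
loss = my_loss(pred, target)
loss.backward()
```

If your loss is an `nn.Module`, you can similarly call `register_hook` on any intermediate tensor inside its `forward`.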
Thank you so much for your quick reply, even on a Sunday!
Could you give an example of how to use a backward hook?
My original problem is
I would appreciate it if you could help!