Backward hook not called on JIT functions

Hi All,

I’ve been trying to register a full backward hook (register_full_backward_hook) on a jitted module, but the hook is never called. I did notice there’s an open issue about this (Support backward hooks in JIT · Issue #51969 · pytorch/pytorch · GitHub).
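For reference, here’s a minimal sketch of what I’m doing (the Net module and its doubling forward are just placeholders for my real code); on my setup, after scripting, the hook never prints:

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def forward(self, x):
        return x * 2

def hook(module, grad_input, grad_output):
    print("backward hook fired")

net = Net()
net.register_full_backward_hook(hook)

# After scripting, the registered backward hook never seems to fire.
scripted = torch.jit.script(net)

x = torch.randn(3, requires_grad=True)
scripted(x).sum().backward()  # nothing is printed
```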

Is there any workaround for this?

Any help would be greatly appreciated! :slight_smile:

"TorchScript has full support for PyTorch’s tape-based autograd. You can call backward() on your tensors if you are recording gradients and it should work. "
maybe this will work. I will try.
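Based on that quote, the kind of workaround I have in mind (a sketch only, not verified against the JIT internals) is to drop the module-level backward hook and rely on plain autograd instead: either read .grad after calling backward(), or attach a tensor-level hook with Tensor.register_hook outside the scripted code. The scripted function f below is just a stand-in for the real computation:

```python
import torch

@torch.jit.script
def f(x):
    # stand-in for the real jitted computation
    return (x * x).sum()

x = torch.randn(3, requires_grad=True)

# Tensor-level hooks live on the autograd graph outside TorchScript,
# so they should still fire when backward() runs through the scripted call.
x.register_hook(lambda grad: print("grad w.r.t. x:", grad))

y = f(x)
y.backward()   # gradients are recorded through the scripted function
print(x.grad)  # the same gradient is also available directly
```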