Unclear about relation between TorchScript and backprop


I am trying to write a TorchScript operator to replace a custom torch.autograd.Function that I use to influence the gradient computation, since TorchScript cannot serialize calls to custom Functions.
Is there an equivalent of torch.autograd.Function in ATen or Torch?
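For context, here is a minimal sketch (with hypothetical names) of the kind of custom torch.autograd.Function I mean: it leaves the forward value untouched but scales the gradient in backward():

```python
import torch

class ScaleGrad(torch.autograd.Function):
    """Hypothetical example: identity in forward, scaled gradient in backward."""

    @staticmethod
    def forward(ctx, x, scale):
        ctx.scale = scale
        return x.clone()  # identity in the forward pass

    @staticmethod
    def backward(ctx, grad_output):
        # Influence the gradient computation: scale it by the given factor.
        # The second return value (for `scale`) is None since it needs no grad.
        return grad_output * ctx.scale, None

x = torch.ones(3, requires_grad=True)
y = ScaleGrad.apply(x, 0.5).sum()
y.backward()
print(x.grad)  # each entry is 0.5 instead of 1.0
```

Calls to `ScaleGrad.apply` inside a `torch.jit.script` function are what fail to serialize.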

Best regards

(Ailing Zhang) #2

@leowalkling, could you elaborate more on what you want to replace here, with an example? In case it’s helpful, in JIT we currently put ops into two categories:

  1. Ops that have autodiff formulas in autodiff.cpp: their backward graph is replaced by the symbolic gradient pass, see https://github.com/pytorch/pytorch/blob/0cb24098c74f8ebed81ec08b83bf6cb5ab3903f5/torch/csrc/jit/graph_executor.cpp#L76
  2. Ops that don’t have autodiff formulas: their backwards are handled directly by eager-mode autograd.
Please let us know the context of your problem so that we can help more.
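To illustrate: whichever category an op falls into, backward through a scripted function still produces the same gradients as eager mode; the categories only affect how the backward graph is built internally. A small sketch (torch.tanh is, I believe, one of the ops with a symbolic autodiff formula):

```python
import torch

@torch.jit.script
def f(x):
    # tanh presumably has a symbolic autodiff formula (category 1);
    # ops without one fall back to eager-mode autograd (category 2).
    return torch.tanh(x).sum()

x = torch.randn(4, requires_grad=True)
f(x).backward()
# The gradient matches the analytic derivative of tanh: 1 - tanh(x)^2.
print(torch.allclose(x.grad, 1 - torch.tanh(x) ** 2))
```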