Does TorchScript support custom autograd functions?


I implemented a custom autograd function following this tutorial:
https://pytorch.org/tutorials/beginner/examples_autograd/two_layer_net_custom_function.html
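Here is, roughly, the custom function from that tutorial (a ReLU implemented as a `torch.autograd.Function`):

```python
import torch

class MyReLU(torch.autograd.Function):
    @staticmethod
    def forward(ctx, input):
        # Save the input so backward() can mask the gradient.
        ctx.save_for_backward(input)
        return input.clamp(min=0)

    @staticmethod
    def backward(ctx, grad_output):
        input, = ctx.saved_tensors
        grad_input = grad_output.clone()
        # Zero out the gradient where the input was negative.
        grad_input[input < 0] = 0
        return grad_input
```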

Does TorchScript support this kind of function? I followed this TorchScript tutorial and an example.

However, I got the error “ValueError: Compiled functions can’t take variable number of arguments or use keyword-only arguments with defaults”, and it is my custom autograd function that causes it. I assume the compiler cannot infer the argument types of the custom autograd function. I cannot find any documentation on how to make a custom autograd function work with TorchScript. A minimal sketch of how I hit the error is below.
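This is roughly what I tried (the module name is just for illustration):

```python
class Model(torch.nn.Module):
    def forward(self, x):
        # Custom autograd functions are invoked via .apply(),
        # whose *args signature the TorchScript compiler rejects.
        return MyReLU.apply(x)

# Raises: ValueError: Compiled functions can't take variable
# number of arguments or use keyword-only arguments with defaults
scripted = torch.jit.script(Model())
```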

We do not currently support custom autograd functions, but it is on our radar and something we would like to add in the future. You can find more context in this issue.

It is also possible to replicate most of the behavior of custom autograd functions today via custom C++ operators, which the TorchScript compiler can call directly.
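For example, the Python side of that approach might look like the sketch below; `my_relu_op.cpp` is a hypothetical source file that would implement the forward/backward logic in C++ and register the operator under the `my_ops` namespace:

```python
import torch
from torch.utils import cpp_extension

# JIT-compile and register the operator implemented in C++.
# "my_relu_op.cpp" is a hypothetical file that defines the op
# and registers it so it appears under torch.ops.my_ops.
cpp_extension.load(
    name="my_ops",
    sources=["my_relu_op.cpp"],
    is_python_module=False,
    verbose=True,
)

@torch.jit.script
def run(x: torch.Tensor) -> torch.Tensor:
    # Registered custom operators are visible to the TorchScript
    # compiler under torch.ops.<namespace>.<op_name>.
    return torch.ops.my_ops.my_relu(x)
```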

Hi, any updates on TorchScript support for custom autograd functions?