Does TorchScript support custom autograd functions?
I implemented a custom autograd function following this tutorial:
https://pytorch.org/tutorials/beginner/examples_autograd/two_layer_net_custom_function.html
Does TorchScript support this kind of function? I followed the TorchScript tutorial (TorchScript and an example).
However, I got the error "ValueError: Compiled functions can't take variable number of arguments or use keyword-only arguments with defaults", and my custom autograd function is what causes it. I assume the compiler doesn't know the argument types of the custom autograd function. I can't find how to make a custom autograd function work with TorchScript.
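For reference, here is a minimal sketch of the kind of custom autograd function the tutorial describes (the class name `MyReLU` is just illustrative). It works fine in eager mode; the error only shows up when I try to compile code that calls `apply`:

```python
import torch

class MyReLU(torch.autograd.Function):
    """Custom ReLU with a hand-written backward pass, following the tutorial."""

    @staticmethod
    def forward(ctx, input):
        # Save the input so backward can recompute the mask.
        ctx.save_for_backward(input)
        return input.clamp(min=0)

    @staticmethod
    def backward(ctx, grad_output):
        # Gradient passes through where the input was positive, else zero.
        input, = ctx.saved_tensors
        grad_input = grad_output.clone()
        grad_input[input < 0] = 0
        return grad_input

# Eager-mode usage works as expected:
x = torch.tensor([-1.0, 2.0], requires_grad=True)
y = MyReLU.apply(x)
y.sum().backward()
print(x.grad)  # tensor([0., 1.])
```

The ValueError appears as soon as this `apply` call sits inside a function or module I pass to `torch.jit.script`.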