Disclaimer: I’m not sure if this is a bug or if I’m missing something.
I’m trying to export to ONNX a Torch model that uses some custom CUDA ops. Following the torch.onnx guide, I define a symbolic function similar to:
```python
def my_symbolic_forward(g, arg0, arg1, int_arg2):
    arg2_i = sym_help._maybe_get_const(int_arg2, "i")
    # [omitted code where I calculate shape information]
    return g.op(
        "my_namespace::my_forward",
        arg0,
        arg1,
        arg2_i=arg2_i,
    )
```
The function is then registered via:

```python
torch.onnx.register_custom_op_symbolic("my_namespace::my_forward", my_symbolic_forward, 1)
```
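For context, the export path looks roughly like the sketch below. The model, example input, output path, and opset versions are placeholders, not my actual values; the point is that I only use `register_custom_op_symbolic` plus the `custom_opsets` argument of `torch.onnx.export`, nothing from onnxscript:

```python
import torch

def export_with_custom_op(model, example_input, my_symbolic_forward):
    # Register the symbolic function for the custom CUDA op
    # (opset_version=1 here matches the registration above).
    torch.onnx.register_custom_op_symbolic(
        "my_namespace::my_forward", my_symbolic_forward, 1
    )
    torch.onnx.export(
        model,
        (example_input,),
        "model.onnx",                       # placeholder output path
        opset_version=13,                   # placeholder ONNX opset
        custom_opsets={"my_namespace": 1},  # version for the custom domain
    )
```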
This was tested and works in Torch 1.9 and 1.13; however, in Torch 2.0.1 I get an AttributeError that looks related to onnxscript. The error is triggered at this line, in the function _find_onnxscript_op, which looks odd, considering I did not go the onnxscript way to export the model with my custom op.
Does this look like a bug (in which case I’ll open an issue on GitHub), or did I miss something that tells PyTorch it should not treat this as onnxscript?