Error when exporting a model to ONNX through dynamo_export

The latest version of PyTorch has the API dynamo_export, which seems designed to export an ONNX model via Dynamo, so I have two questions:

    1. What’s the difference between torch.onnx.export and torch.onnx.dynamo_export? They both take a normal nn.Module and generate an ONNX model.
    2. I can’t even export a ResNet-50 from mmpretrain through dynamo_export, and I can’t find any documentation for it, so is it still at an early stage?

They differ in how the graph is captured: the regular export uses TorchScript/JIT tracing, while the Dynamo export uses Dynamo.

Dynamo export is still in its early stages, but it will likely become the default in the future, given that TorchScript development has been paused.

Thanks, Mark!

Btw, since dynamo_export takes a normal nn.Module in and outputs an ONNX model, could we have an API that lets us feed a traced fx.Graph into it and get an ONNX model out?

Since PyTorch is also working on another quantization tool, pt2e, such an API could be useful: we could quantize and optimize models in fx.Graph format, then export them to an ONNX model for serving.
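To illustrate the shape of the object in question: a traced model is a `torch.fx.GraphModule`, which is itself a callable `nn.Module`, so in principle it could be handed to an exporter like any other module. A minimal sketch (this only shows FX tracing, not the pt2e quantization flow itself):

```python
import torch
import torch.fx

class Net(torch.nn.Module):
    def forward(self, x):
        return x * 2 + 1

# symbolic_trace returns a torch.fx.GraphModule wrapping the captured graph.
gm = torch.fx.symbolic_trace(Net())

# The GraphModule is still an nn.Module and can be called like one,
# which is why feeding it to an ONNX exporter is a natural request.
out = gm(torch.ones(3))
print(out)  # tensor([3., 3., 3.])
```

Whether `torch.onnx.dynamo_export` accepts such a GraphModule directly (especially one produced by pt2e, whose graphs contain quantize/dequantize ops) is exactly the open question in this thread.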

cc @BowenBao who is maybe already looking into this

I would like to know where we are on this. I have an fx.GraphModule generated from pt2e after quantization, but I am not able to export it to ONNX. I am really interested in this because I got the best quantization results using pt2e compared to eager mode.
