Any known workarounds for exporting an nn.Transformer model to ONNX?

Exporting a vanilla nn.Transformer to ONNX gives me the error that aten::unflatten is not implemented. I see a few patches from the last few days related to this.
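For reference, here is a minimal sketch of what I'm doing (the model size, input shapes, and output file name are arbitrary):

```python
import torch
import torch.nn as nn

model = nn.Transformer(d_model=64, nhead=4, num_encoder_layers=2, num_decoder_layers=2)
model.eval()

# Default batch_first=False, so inputs are (seq_len, batch, d_model)
src = torch.randn(10, 32, 64)
tgt = torch.randn(20, 32, 64)

# On torch 2.0.0 this raises:
# UnsupportedOperatorError: Exporting the operator 'aten::unflatten' to ONNX is not supported
torch.onnx.export(model, (src, tgt), "transformer.onnx", opset_version=16)
```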

Is there a viable workaround without waiting for the next release?

Which version of torch are you using? It looks like this was already fixed in the nightlies: UnsupportedOperatorError: Exporting the operator 'aten::unflatten' to ONNX is not supported · Issue #98190 · pytorch/pytorch · GitHub
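You can check your installed version with:

```python
import torch
print(torch.__version__)  # nightlies report a ".dev" version string
```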

I’m using 2.0.0; how do I get the nightlies?

Go to pytorch.org and select the Nightly build, your OS, and your desired CUDA version: https://pytorch.org/
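For example, a pip install of a nightly looks something like this (the exact index URL depends on the CUDA version you pick in the selector; cu118 here is just an example):

```
pip3 install --pre torch --index-url https://download.pytorch.org/whl/nightly/cu118
```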

Awesome, I’ll give it a try, thanks so much.

Thanks, using the nightly solved the issue!

The nightly build seems to be working for me without any error, but the export itself is extremely slow with opset version 16.