PyTorch quantized model to ONNX - quantized_decomposed::quantize_per_tensor Error

Hi all, I need some quick help!

I am trying to convert a quantized PyTorch model to the ONNX format.
The export fails with the following error:

torch.onnx.errors.UnsupportedOperatorError: ONNX Export failed on an operator with unrecognized namespace quantized_decomposed::quantize_per_tensor. If you are trying to export a custom operator, make sure you registered it with the right domain and version.

Can you let me know why PyTorch's quantized_decomposed::quantize_per_tensor is not supported for ONNX conversion, or whether there is a workaround?

Thanks

I am using the xnnpack_quantizer.
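For context, the quantized_decomposed ops appear in graphs produced by the PT2E quantization flow, and quantize_per_tensor is just standard affine quantization (the same math as ONNX's QuantizeLinear). A pure-Python sketch of what the op computes; the scale/zero-point values below are made up for illustration:

```python
def quantize_per_tensor(x, scale, zero_point, quant_min, quant_max):
    """Affine quantization: q = clamp(round(x / scale) + zero_point, qmin, qmax)."""
    q = round(x / scale) + zero_point
    return max(quant_min, min(quant_max, q))

# int8 range with an arbitrary scale and zero-point:
print(quantize_per_tensor(0.5, 0.02, 0, -128, 127))   # 25
print(quantize_per_tensor(10.0, 0.02, 0, -128, 127))  # clamped to 127
```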

You’ll need to ask the ONNX team to add the op, I think: Issues · onnx/onnx · GitHub
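In the meantime, one possible workaround with the TorchScript-based exporter is to register custom symbolics that map the quantized_decomposed ops onto the standard ONNX QuantizeLinear/DequantizeLinear ops before calling torch.onnx.export. This is a sketch, not a definitive fix: the argument order of the decomposed ops shown here is an assumption and may differ across torch versions, so check your exported graph first.

```python
import torch
from torch.onnx import register_custom_op_symbolic

# Assumption: the decomposed ops take
# (input, scale, zero_point, quant_min, quant_max, dtype);
# verify against your graph, as the signature may vary by torch version.
def quantize_per_tensor_symbolic(g, x, scale, zero_point, quant_min, quant_max, dtype):
    # Emit the standard ONNX quantize op; quant_min/quant_max/dtype are
    # implied by the zero_point tensor's dtype in ONNX and are dropped here.
    return g.op("QuantizeLinear", x, scale, zero_point)

def dequantize_per_tensor_symbolic(g, x, scale, zero_point, quant_min, quant_max, dtype):
    return g.op("DequantizeLinear", x, scale, zero_point)

register_custom_op_symbolic(
    "quantized_decomposed::quantize_per_tensor",
    quantize_per_tensor_symbolic,
    opset_version=13,
)
register_custom_op_symbolic(
    "quantized_decomposed::dequantize_per_tensor",
    dequantize_per_tensor_symbolic,
    opset_version=13,
)
```

After registering, call torch.onnx.export as usual with opset_version=13 or higher (QuantizeLinear/DequantizeLinear require opset 10+). I have not verified this against every quantizer configuration, so treat it as a starting point.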

UnsupportedOperatorError: ONNX Export failed on an operator with unrecognized namespace quantized_decomposed::quantize_per_tensor

I have run into the same issue.