Convert quantized model to ONNX format

Hi Ahmed,

As far as I know, exporting quantized models to ONNX is not officially supported, and we are not actively working on this integration. However, here’s a thread that you may find useful: ONNX export of quantized model - #32 by tsaiHY. I would guess that more complex ops, LSTMs/GRUs in particular, are not well supported.
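To illustrate the kind of failure you may run into, here is a minimal sketch that dynamically quantizes a small PyTorch model and then attempts an ONNX export. This is an assumption-laden example, not an official recipe: whether the export succeeds depends heavily on your PyTorch version and which quantized ops get symbolic ONNX mappings, which is why the export call is wrapped in try/except.

```python
import io

import torch
import torch.nn as nn

# Small example model. Per the thread, more complex ops (LSTM/GRU)
# are even more likely to hit unsupported-op errors on export.
model = nn.Sequential(nn.Linear(8, 8), nn.ReLU(), nn.Linear(8, 2))
model.eval()

# Post-training dynamic quantization of the Linear layers.
qmodel = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

dummy = torch.randn(1, 8)
buf = io.BytesIO()
try:
    # Export may raise if a quantized op has no ONNX symbolic function.
    torch.onnx.export(qmodel, dummy, buf, opset_version=13)
    result = "exported"
except Exception as e:
    result = f"failed: {type(e).__name__}"

print(result)
```

If the export fails, the exception usually names the unsupported quantized operator, which tells you whether the thread's workarounds are applicable to your model.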

Best,
-Andrew