Quantized PyTorch model export to ONNX

I think the error output may be a bit misleading. I’ll take a look at the backend. Did you see this post: ONNX export of quantized model - #17 by mhamdan?
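For anyone hitting the same issue, here is a minimal sketch of the workflow under discussion: eager-mode static quantization of a toy model followed by a `torch.onnx.export` call. The model, shapes, file name, and opset are my own assumptions for illustration, not from the original thread; quantized-op export support varies by PyTorch version, and the export step is typically where the backend error appears.

```python
import torch
import torch.nn as nn

# Toy model wrapped with quant/dequant stubs for eager-mode static quantization.
# (Hypothetical example model, not from the thread.)
class TinyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.quant = torch.quantization.QuantStub()
        self.fc = nn.Linear(16, 4)
        self.relu = nn.ReLU()
        self.dequant = torch.quantization.DeQuantStub()

    def forward(self, x):
        x = self.quant(x)        # fp32 -> int8
        x = self.relu(self.fc(x))
        return self.dequant(x)   # int8 -> fp32

model = TinyModel().eval()
model.qconfig = torch.quantization.get_default_qconfig("fbgemm")
torch.quantization.prepare(model, inplace=True)

# Calibrate with representative data so the observers can pick scales/zero-points.
with torch.no_grad():
    model(torch.randn(8, 16))

torch.quantization.convert(model, inplace=True)

# Attempt the ONNX export. Quantized ops require opset >= 10, and depending
# on the PyTorch version this call may raise a backend error like the one
# discussed above.
torch.onnx.export(
    model,
    torch.randn(1, 16),          # dummy input with the expected shape
    "quantized_model.onnx",      # assumed output path
    opset_version=13,
)
```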