Does PyTorch support converting quantized models to ONNX?

I believe this thread covers the current state of ONNX export support for quantized models: ONNX export of quantized model - #23 by ZyrianovS
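
As a quick way to check on your own setup, here is a minimal sketch (not an official recipe) that dynamically quantizes a small model and then attempts `torch.onnx.export`. Exporter support for quantized ops is partial and version-dependent, so the export may fail for layer types other than `nn.Linear`; the model architecture and opset version here are arbitrary choices for illustration.

```python
# Sketch: dynamically quantize a toy model, then try exporting it to ONNX.
# Assumes a reasonably recent PyTorch; quantized-op export support varies
# by version, which is why the export is wrapped in try/except.
import io

import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(16, 8), nn.ReLU(), nn.Linear(8, 4))
model.eval()

# Dynamic quantization: weights stored as int8, activations quantized on the fly.
qmodel = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

example = torch.randn(1, 16)
buf = io.BytesIO()
try:
    torch.onnx.export(qmodel, example, buf, opset_version=13)
    print("export succeeded, wrote", buf.tell(), "bytes")
except Exception as exc:  # export support depends on PyTorch version / ops used
    print("export failed:", exc)
```

If the export succeeds, the resulting graph can be inspected or run with ONNX Runtime; if it fails, the error message usually names the unsupported quantized op, which is a good starting point for searching the thread linked above.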