Does PyTorch support quantized model conversion to ONNX?

I am trying to convert a quantized model trained in PyTorch to ONNX, and the export fails with the following traceback:

  File "test_QATmodel.py", line 276, in test
    torch.onnx.export(model_new, sample, 'quantized.onnx')#, opset_version=11, operator_export_type=torch.onnx.OperatorExportTypes.ONNX_ATEN_FALLBACK)
  File "/monly/workspaces/bigtree/miniconda3/envs/color_detcet/lib/python3.7/site-packages/torch/onnx/__init__.py", line 280, in export
    custom_opsets, enable_onnx_checker, use_external_data_format)
  File "/monly/workspaces/bigtree/miniconda3/envs/color_detcet/lib/python3.7/site-packages/torch/onnx/utils.py", line 94, in export
    use_external_data_format=use_external_data_format)
  File "monly/workspaces/bigtree/miniconda3/envs/color_detcet/lib/python3.7/site-packages/torch/onnx/utils.py", line 695, in _export
    dynamic_axes=dynamic_axes)
  File "monly/workspaces/bigtree/miniconda3/envs/color_detcet/lib/python3.7/site-packages/torch/onnx/utils.py", line 459, in _model_to_graph
    _retain_param_name)
  File "monly/workspaces/bigtree/miniconda3/envs/color_detcet/lib/python3.7/site-packages/torch/onnx/utils.py", line 422, in _create_jit_graph
    graph, torch_out = _trace_and_get_graph_from_model(model, args)
  File "monly/workspaces/bigtree/miniconda3/envs/color_detcet/lib/python3.7/site-packages/torch/onnx/utils.py", line 370, in _trace_and_get_graph_from_model
    orig_state_dict_keys = _unique_state_dict(model).keys()
  File "monly/workspaces/bigtree/miniconda3/envs/color_detcet/lib/python3.7/site-packages/torch/jit/_trace.py", line 71, in _unique_state_dict
    filtered_dict[k] = v.detach()
AttributeError: 'NoneType' object has no attribute 'detach'
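
For context, here is a minimal sketch of the kind of setup that hits this error. The toy model, layer sizes, qconfig, and calibration step below are illustrative, not my actual network; the ATen-fallback arguments are the ones commented out in my export call above:

```python
import torch
import torch.nn as nn

# Toy eager-mode static-quantization model (illustrative only).
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.quant = torch.quantization.QuantStub()
        self.conv = nn.Conv2d(3, 8, 3)
        self.relu = nn.ReLU()
        self.dequant = torch.quantization.DeQuantStub()

    def forward(self, x):
        x = self.quant(x)
        x = self.relu(self.conv(x))
        return self.dequant(x)

model = TinyNet().eval()
model.qconfig = torch.quantization.get_default_qconfig("fbgemm")
torch.quantization.prepare(model, inplace=True)
model(torch.randn(1, 3, 32, 32))           # calibration pass for the observers
torch.quantization.convert(model, inplace=True)

sample = torch.randn(1, 3, 32, 32)
# Export of the converted (quantized) model; this is the call that raises
# the AttributeError for me, with or without the ATen-fallback options.
torch.onnx.export(
    model, sample, "quantized.onnx",
    opset_version=11,
    operator_export_type=torch.onnx.OperatorExportTypes.ONNX_ATEN_FALLBACK,
)
```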

Is quantized model -> ONNX export supported today?

I believe this thread goes through the existing support for ONNX: ONNX export of quantized model - #23 by ZyrianovS