Graph Mode Quantization Aware Training in PyTorch

Hi,

Graph Mode Post Training Quantization in PyTorch is awesome because it doesn't require the model to be defined in a particular way, using only Modules. (https://pytorch.org/tutorials/prototype/graph_mode_static_quantization_tutorial.html)

Wondering if there is a plan for Graph Mode Quantization Aware Training in PyTorch as well?

Thanks,
Manu

Hi Manu,

Yes. It's very much a work in progress, and the code is at https://github.com/pytorch/pytorch/tree/master/torch/quantization/fx. Tutorials and docs will be released once it's ready.
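For anyone landing here later: a minimal sketch of what FX graph mode QAT ended up looking like, using the `prepare_qat_fx` / `convert_fx` entry points as they later stabilized under `torch.ao.quantization` (the exact module paths and signatures were still in flux at the time of this thread, so treat this as illustrative, not the final API):

```python
import torch
from torch.ao.quantization import get_default_qat_qconfig_mapping
from torch.ao.quantization.quantize_fx import prepare_qat_fx, convert_fx

class M(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(4, 4)

    def forward(self, x):
        # functional ops are traced and quantized too; no need to
        # rewrite them as Modules, unlike eager mode
        return torch.nn.functional.relu(self.linear(x))

model = M().train()
example_inputs = (torch.randn(8, 4),)
qconfig_mapping = get_default_qat_qconfig_mapping("fbgemm")

# symbolically trace the model and insert fake-quant modules
prepared = prepare_qat_fx(model, qconfig_mapping, example_inputs)

# stand-in training loop: fake quantization is active during training
opt = torch.optim.SGD(prepared.parameters(), lr=0.01)
for _ in range(3):
    loss = prepared(*example_inputs).sum()
    opt.zero_grad()
    loss.backward()
    opt.step()

# convert the fake-quantized model into a real int8 model
quantized = convert_fx(prepared.eval())
```

The key difference from eager-mode QAT is that no manual module fusion or `QuantStub`/`DeQuantStub` placement is needed; FX tracing discovers the graph automatically.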

You can monitor the progress here: https://github.com/pytorch/pytorch/issues/45635

Thanks Daya,

What would be the rough timeline for graph mode QAT to become available? Will it be included in the PyTorch 1.7 release?

Also wondering what the difference would be between quantize_jit and quantize_fx?

It's not part of 1.7. quantize_jit is the current API used for graph mode post-training quantization; it will be deprecated once quantize_fx (i.e., the new graph mode API) is available.
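To make the contrast concrete: quantize_jit operates on TorchScript models, while the FX path works on regular `nn.Module`s via symbolic tracing. A minimal post-training sketch of the FX workflow, using the `prepare_fx` / `convert_fx` names as they later stabilized under `torch.ao.quantization` (an assumption relative to this thread, where the API was still prototype):

```python
import torch
from torch.ao.quantization import get_default_qconfig_mapping
from torch.ao.quantization.quantize_fx import prepare_fx, convert_fx

class M(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(4, 4)

    def forward(self, x):
        return torch.nn.functional.relu(self.linear(x))

model = M().eval()
example_inputs = (torch.randn(1, 4),)
qconfig_mapping = get_default_qconfig_mapping("fbgemm")

# trace the model and insert observers for calibration
prepared = prepare_fx(model, qconfig_mapping, example_inputs)

# calibrate with representative data (here just the example batch)
with torch.no_grad():
    prepared(*example_inputs)

# swap observed ops for their int8 quantized counterparts
quantized = convert_fx(prepared)
```

Unlike quantize_jit, there is no need to `torch.jit.script` the model first; the traced `GraphModule` can still be scripted afterwards for deployment.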
