Mixed Precision Training and Quantisation Aware Training together


I’m wondering whether it is possible to perform mixed precision training and quantisation-aware training together?

I’m working on an image classification model trained with DDP. I have tried mixed precision training and quantisation-aware training separately, each on its own.

I thought I’d get clarification before implementing the two together.

@ptrblck @suraj.pt @albanD

There is a known issue with AMP + QAT, as described in QAT + torch.autocast does not work with default settings, missing fused fake_quant support for half · Issue #94371 · pytorch/pytorch · GitHub. As a workaround, you can pass version=0 to get_default_qat_qconfig_mapping, as mentioned in the issue, which selects the non-fused fake-quant modules and lets the two work together.
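A minimal sketch of that workaround, using a toy stand-in model rather than your actual classifier (the model definition, shapes, and hyperparameters here are illustrative assumptions; the `version=0` argument to `get_default_qat_qconfig_mapping` is the part taken from the issue):

```python
import torch
import torch.nn as nn
from torch.ao.quantization import get_default_qat_qconfig_mapping
from torch.ao.quantization.quantize_fx import prepare_qat_fx

# Toy stand-in for an image-classification model
model = nn.Sequential(
    nn.Conv2d(3, 8, 3),
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(8 * 30 * 30, 10),
).train()

# version=0 selects the non-fused fake-quant modules -- the workaround from
# issue #94371 for the fused fake-quant kernels (the default) lacking half support
qconfig_mapping = get_default_qat_qconfig_mapping("fbgemm", version=0)

example_inputs = (torch.randn(1, 3, 32, 32),)
qat_model = prepare_qat_fx(model, qconfig_mapping, example_inputs)

optimizer = torch.optim.SGD(qat_model.parameters(), lr=1e-3)

# The usual AMP training step, now running on the QAT-prepared model
if torch.cuda.is_available():
    qat_model.cuda()
    scaler = torch.cuda.amp.GradScaler()
    images = torch.randn(4, 3, 32, 32, device="cuda")
    labels = torch.randint(0, 10, (4,), device="cuda")
    with torch.autocast(device_type="cuda", dtype=torch.float16):
        loss = nn.functional.cross_entropy(qat_model(images), labels)
    scaler.scale(loss).backward()
    scaler.step(optimizer)
    scaler.update()
```

For DDP you would wrap `qat_model` in `DistributedDataParallel` after `prepare_qat_fx`, as usual; the fake-quant observers are ordinary modules, so they travel with the wrapped model.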