Fbgemm not available

Out[3]: ['qnnpack', 'none']

fbgemm is not supported.
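For reference, this is a minimal way to check which engines a PyTorch build supports and to select qnnpack explicitly (a sketch; the exact list depends on your build):

```python
import torch

# List the quantization engines compiled into this PyTorch build.
# On the build above this prints ['qnnpack', 'none'], i.e. no fbgemm.
print(torch.backends.quantized.supported_engines)

# Select qnnpack explicitly before running any quantized ops,
# guarding in case this particular build lacks it.
if 'qnnpack' in torch.backends.quantized.supported_engines:
    torch.backends.quantized.engine = 'qnnpack'
```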

I tried to install fbgemm on my Ubuntu 20 machine and reinstalled PyTorch 1.8, but the problem is still there.

I also tried to use qnnpack to run Quantization Aware Training on BERT. Whether I set the Embedding layer's qconfig to None or to float16_dynamic_qconfig, I got the error message:

Didn't find engine for operation quantized::linear_prepack NoQEngine
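The NoQEngine in the error suggests no quantization engine was active when convert ran its linear_prepack step. A minimal QAT sketch that sets the engine first and skips the embedding via a per-module qconfig (TinyModel is a hypothetical stand-in for BERT, which is too large to reproduce here):

```python
import torch
import torch.nn as nn
import torch.quantization as tq

# Hypothetical stand-in for a model with an embedding followed by a
# linear layer; the original model in the question is BERT.
class TinyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(100, 16)
        self.fc = nn.Linear(16, 2)

    def forward(self, x):
        return self.fc(self.emb(x))

model = TinyModel()
model.train()

# Select the backend *before* prepare/convert; guard in case this
# build lacks qnnpack (then the build's default engine is used).
if 'qnnpack' in torch.backends.quantized.supported_engines:
    torch.backends.quantized.engine = 'qnnpack'

# QAT qconfig matched to the qnnpack backend.
model.qconfig = tq.get_default_qat_qconfig('qnnpack')
# Skip quantization for the embedding layer entirely.
model.emb.qconfig = None

tq.prepare_qat(model, inplace=True)

# One forward pass so the observers see some data
# (a real QAT run would train here).
model(torch.randint(0, 100, (4, 8)))

model.eval()
quantized = tq.convert(model)
```

After convert, `quantized.fc` is a quantized Linear while `quantized.emb` stays a float Embedding; the prepack error only disappeared for me once the engine was set before conversion, so this ordering is the point of the sketch.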