ONNX export of quantized model

Hi, I'm trying to export a model quantized with pytorch-quantization, but I get this error:

RuntimeError: ONNX export failed: Couldn't export Python operator FakeTensorQuantFunction
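
Roughly what I'm doing (a simplified sketch; the real model, input shape, and calibration step differ):

```python
import torch
import torchvision

from pytorch_quantization import quant_modules

# Swap standard layers (Conv2d, Linear, ...) for quantized versions
# before the model is constructed
quant_modules.initialize()

# resnet18 is only a stand-in here; my actual model is different
model = torchvision.models.resnet18()
model.eval()

# ... calibration with pytorch_quantization happens here ...

dummy_input = torch.randn(1, 3, 224, 224)

# This is the call that fails with
# "Couldn't export Python operator FakeTensorQuantFunction"
torch.onnx.export(model, dummy_input, "model_quant.onnx", opset_version=13)
```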
