Quantized PyTorch model exports to ONNX

Hi guys,

Converting torchvision (v0.11) int8 quantized models to ONNX produces the following error:

AttributeError: 'torch.dtype' object has no attribute 'detach'

Is it not supported yet?
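For reference, here is a minimal sketch of the kind of thing I'm trying. It uses a tiny hand-written module (`TinyNet`, a stand-in I made up rather than an actual torchvision model) quantized with eager-mode post-training quantization, then attempts `torch.onnx.export`. Depending on your torch version, the export step may fail with the error above or a different unsupported-operator message:

```python
import io

import torch
import torch.nn as nn

# Hypothetical stand-in for a torchvision classifier, small enough to run anywhere.
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.quant = torch.quantization.QuantStub()      # fp32 -> int8 boundary
        self.fc = nn.Linear(8, 4)
        self.dequant = torch.quantization.DeQuantStub()  # int8 -> fp32 boundary

    def forward(self, x):
        x = self.quant(x)
        x = self.fc(x)
        return self.dequant(x)

model = TinyNet().eval()
model.qconfig = torch.quantization.get_default_qconfig("fbgemm")
torch.quantization.prepare(model, inplace=True)
model(torch.randn(1, 8))                     # one calibration pass
torch.quantization.convert(model, inplace=True)

out = model(torch.randn(1, 8))               # int8 inference itself works
print(out.shape)

# The export is where things go wrong for me; wrapped here so the
# sketch runs to completion either way.
try:
    torch.onnx.export(model, torch.randn(1, 8), io.BytesIO(), opset_version=13)
    print("export succeeded")
except Exception as e:
    print("export failed:", type(e).__name__)
```

The quantized forward pass runs fine; only the `torch.onnx.export` call fails.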