ONNX export failed for int8 model

Does torch 1.8.1 support exporting QAT models to ONNX? How do I do that?
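For context, this is the kind of flow I have in mind (a minimal eager-mode QAT sketch; TinyNet, the fbgemm backend, and opset 13 are placeholders I picked, not something confirmed to work):

```python
import torch
import torch.nn as nn
from torch.quantization import convert, get_default_qat_qconfig, prepare_qat

# Toy stand-in for the real network (placeholder).
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, 3)
        self.relu = nn.ReLU()

    def forward(self, x):
        return self.relu(self.conv(x))

model = TinyNet().train()
model.qconfig = get_default_qat_qconfig("fbgemm")  # assuming an x86 backend
prepare_qat(model, inplace=True)  # insert fake-quant modules

# ... QAT fine-tuning loop would go here ...

model.eval()
quantized = convert(model)  # fake-quant -> real int8 modules

# The export step in question:
torch.onnx.export(
    quantized,
    torch.randn(1, 3, 32, 32),
    "qat_int8.onnx",
    opset_version=13,
)
```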

Hi, I ran into the same problem:

_trace.py", line 71, in _unique_state_dict
    filtered_dict[k] = v.detach()
AttributeError: 'torch.dtype' object has no attribute 'detach'

when trying to export a quantized model to ONNX.

I am quantizing the model with torch.fx.
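Roughly what I'm doing, as a minimal sketch (TinyNet, the input shape, and opset 13 are placeholders for my real setup; this uses the qconfig-dict form of prepare_fx from torch <= 1.12):

```python
import torch
import torch.nn as nn
from torch.quantization import get_default_qconfig
from torch.quantization.quantize_fx import convert_fx, prepare_fx

# Toy stand-in for the real network (placeholder).
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, 3)
        self.relu = nn.ReLU()

    def forward(self, x):
        return self.relu(self.conv(x))

model = TinyNet().eval()
example_input = torch.randn(1, 3, 32, 32)

# FX graph-mode post-training quantization.
qconfig_dict = {"": get_default_qconfig("fbgemm")}
prepared = prepare_fx(model, qconfig_dict)
prepared(example_input)           # calibration pass
quantized = convert_fx(prepared)  # int8 model

# This is the call that raises the AttributeError above for me.
torch.onnx.export(quantized, example_input, "model_int8.onnx", opset_version=13)
```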

The error happens exactly as in the OP's case, and I can't find any sign that this problem has been solved.

It doesn't even work in PyTorch 1.11, I think.