Quantizing a model that contains ReLU6

Hi, can I quantize a model that contains the ReLU6 activation function?


Yes, you can. ReLU6 was added to DEFAULT_MODULE_MAPPING. See Quantized hard sigmoid.
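
For reference, here is a minimal sketch of the eager-mode post-training static quantization flow with a ReLU6 model, assuming a PyTorch version where ReLU6 is in the default mapping as described above (the TinyNet module and the random calibration input are illustrative, not from this thread):

    import torch
    import torch.nn as nn

    # Toy model using ReLU6. QuantStub/DeQuantStub mark where tensors
    # cross between float and quantized representations in eager mode.
    class TinyNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.quant = torch.quantization.QuantStub()
            self.conv = nn.Conv2d(3, 8, 3)
            self.relu6 = nn.ReLU6()
            self.dequant = torch.quantization.DeQuantStub()

        def forward(self, x):
            x = self.quant(x)
            x = self.relu6(self.conv(x))
            return self.dequant(x)

    model = TinyNet().eval()
    model.qconfig = torch.quantization.get_default_qconfig('fbgemm')
    torch.quantization.prepare(model, inplace=True)

    # Calibrate with representative data (random here, for illustration only).
    with torch.no_grad():
        model(torch.randn(1, 3, 32, 32))

    torch.quantization.convert(model, inplace=True)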

Hi @pshashk, I tried to fuse a model with the ReLU6 activation function, but it throws an error. I see in the PyTorch source code that fuse_modules currently supports only four sequences of modules:

  Fuses only the following sequence of modules:
    conv, bn
    conv, bn, relu
    conv, relu
    linear, relu

https://github.com/pytorch/pytorch/blob/master/torch/quantization/fuse_modules.py
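
For context, here is a minimal example of fusing one of the supported patterns (the ConvBNReLU module and its attribute names are illustrative):

    import torch
    import torch.nn as nn

    # A conv-bn-relu block, one of the sequences fuse_modules supports.
    class ConvBNReLU(nn.Module):
        def __init__(self):
            super().__init__()
            self.conv = nn.Conv2d(3, 8, 3)
            self.bn = nn.BatchNorm2d(8)
            self.relu = nn.ReLU()

        def forward(self, x):
            return self.relu(self.bn(self.conv(x)))

    m = ConvBNReLU().eval()  # conv+bn fusion expects eval mode for PTQ
    # Each inner list names one sequence of submodules to fuse into
    # a single combined module.
    fused = torch.quantization.fuse_modules(m, [['conv', 'bn', 'relu']])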

That is correct; we will work on adding support for fusing ReLU6 soon. For now, if you are doing post-training quantization, you can replace ReLU6 with ReLU as a workaround and proceed.
Thanks,


Thank you. I hope you release it soon.

@raghuramank100 can you provide an example of how to replace ReLU6 with ReLU? I am trying to quantize a network with ReLU6 activations.
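
One rough sketch of such a replacement, assuming an eager-mode nn.Module (the helper name replace_relu6_with_relu is hypothetical, not a PyTorch API):

    import torch.nn as nn

    def replace_relu6_with_relu(module):
        # Recursively swap every nn.ReLU6 child for nn.ReLU so the
        # standard fusion patterns (conv+relu, linear+relu) apply.
        for name, child in module.named_children():
            if isinstance(child, nn.ReLU6):
                setattr(module, name, nn.ReLU(inplace=child.inplace))
            else:
                replace_relu6_with_relu(child)
        return module

    model = replace_relu6_with_relu(model)

Note that ReLU and ReLU6 differ for activations above 6, so the swapped model's outputs can change; it is worth re-validating accuracy after quantization.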