Which activation is suggested for QAT?

For QAT, which activation function is suggested (ReLU, ReLU6, Hardtanh)?

If you are going to use QAT and need an activation, it's probably a good idea to pick one that can be fused with the preceding op, so that the fake-quant observers see the fused output rather than an intermediate activation. A list of supported fusion patterns can be found here: pytorch/fusion_patterns.py at 10411e356160e1d0f406a9cee435e51f95fa90fa · pytorch/pytorch · GitHub
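
For example, here is a minimal eager-mode QAT sketch of what fusing looks like in practice. The `SmallNet` module and its layer names are made up for illustration; the fusion and preparation calls (`fuse_modules_qat`, `get_default_qat_qconfig`, `prepare_qat`) are the standard `torch.ao.quantization` eager-mode API. Note that ReLU participates in the Conv+BN+ReLU fusion pattern, which is one practical reason to favor it here:

```python
import torch
import torch.nn as nn
from torch.ao.quantization import (
    fuse_modules_qat,
    get_default_qat_qconfig,
    prepare_qat,
)

# Hypothetical model used only to illustrate fusion; ReLU is chosen
# because Conv + BN + ReLU is a supported fusion pattern.
class SmallNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
        self.bn = nn.BatchNorm2d(16)
        self.relu = nn.ReLU()

    def forward(self, x):
        return self.relu(self.bn(self.conv(x)))

model = SmallNet()
model.train()  # QAT fusion expects the model in training mode

# Fuse Conv + BN + ReLU into a single module so a single
# fake-quant observer watches the fused output.
model = fuse_modules_qat(model, [["conv", "bn", "relu"]])

# Attach a QAT qconfig and insert fake-quant modules.
model.qconfig = get_default_qat_qconfig("fbgemm")
model = prepare_qat(model)

# The model can now be fine-tuned as usual before conversion.
out = model(torch.randn(1, 3, 32, 32))
```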