Performance drop when quantizing EfficientNet

I just ran the activation comparison suggested in the provided guide, and this is what I got:

conv_stem.0.stats tensor(28.3666)
conv_stem.1.stats tensor(28.3666)
blocks.0.0.conv_dw.0.stats tensor(16.1361)
blocks.0.0.conv_dw.1.stats tensor(16.1361)
blocks.0.0.conv_pw.stats tensor(8.5438)
blocks.1.0.conv_pw.0.stats tensor(7.0812)
blocks.1.0.conv_pw.1.stats tensor(10.7929)
blocks.1.0.conv_dw.0.stats tensor(10.3284)
blocks.1.0.conv_dw.1.stats tensor(11.8796)
blocks.1.0.conv_pwl.stats tensor(6.0492)
blocks.1.1.conv_pw.0.stats tensor(9.7360)
blocks.1.1.conv_pw.1.stats tensor(11.2618)
blocks.1.1.conv_dw.0.stats tensor(8.9654)
blocks.1.1.conv_dw.1.stats tensor(9.1349)
blocks.1.1.conv_pwl.stats tensor(5.3888)
blocks.2.0.conv_pw.0.stats tensor(8.9415)
blocks.2.0.conv_pw.1.stats tensor(10.1787)
blocks.2.0.conv_dw.0.stats tensor(12.5325)
blocks.2.0.conv_dw.1.stats tensor(14.5331)
blocks.2.0.conv_pwl.stats tensor(5.8452)
blocks.2.1.conv_pw.0.stats tensor(10.9424)
blocks.2.1.conv_pw.1.stats tensor(11.9166)
blocks.2.1.conv_dw.0.stats tensor(10.7086)
blocks.2.1.conv_dw.1.stats tensor(11.7042)
blocks.2.1.conv_pwl.stats tensor(3.9516)
blocks.3.0.conv_pw.0.stats tensor(7.9058)
blocks.3.0.conv_pw.1.stats tensor(8.7798)
blocks.3.0.conv_dw.0.stats tensor(13.6778)
blocks.3.0.conv_dw.1.stats tensor(15.0221)
blocks.3.0.conv_pwl.stats tensor(7.0661)
blocks.3.1.conv_pw.0.stats tensor(11.2245)
blocks.3.1.conv_pw.1.stats tensor(12.1855)
blocks.3.1.conv_dw.0.stats tensor(10.3169)
blocks.3.1.conv_dw.1.stats tensor(7.3186)
blocks.3.1.conv_pwl.stats tensor(5.9016)
blocks.3.2.conv_pw.0.stats tensor(10.9814)
blocks.3.2.conv_pw.1.stats tensor(12.2782)
blocks.3.2.conv_dw.0.stats tensor(11.5729)
blocks.3.2.conv_dw.1.stats tensor(6.8540)
blocks.3.2.conv_pwl.stats tensor(4.0227)
blocks.4.0.conv_pw.0.stats tensor(9.5918)
blocks.4.0.conv_pw.1.stats tensor(10.4552)
blocks.4.0.conv_dw.0.stats tensor(11.8454)
blocks.4.0.conv_dw.1.stats tensor(12.2951)
blocks.4.0.conv_pwl.stats tensor(4.5780)
blocks.4.1.conv_pw.0.stats tensor(9.8242)
blocks.4.1.conv_pw.1.stats tensor(9.5439)
blocks.4.1.conv_dw.0.stats tensor(12.6775)
blocks.4.1.conv_dw.1.stats tensor(10.9211)
blocks.4.1.conv_pwl.stats tensor(2.9198)
blocks.4.2.conv_pw.0.stats tensor(9.9729)
blocks.4.2.conv_pw.1.stats tensor(9.4751)
blocks.4.2.conv_dw.0.stats tensor(14.5569)
blocks.4.2.conv_dw.1.stats tensor(12.2109)
blocks.4.2.conv_pwl.stats tensor(3.3256)
blocks.5.0.conv_pw.0.stats tensor(10.7336)
blocks.5.0.conv_pw.1.stats tensor(9.2929)
blocks.5.0.conv_dw.0.stats tensor(19.4747)
blocks.5.0.conv_dw.1.stats tensor(21.1074)
blocks.5.0.conv_pwl.stats tensor(8.3158)
blocks.5.1.conv_pw.0.stats tensor(12.8702)
blocks.5.1.conv_pw.1.stats tensor(12.2446)
blocks.5.1.conv_dw.0.stats tensor(14.1980)
blocks.5.1.conv_dw.1.stats tensor(12.0078)
blocks.5.1.conv_pwl.stats tensor(7.1764)
blocks.5.2.conv_pw.0.stats tensor(13.4789)
blocks.5.2.conv_pw.1.stats tensor(12.8941)
blocks.5.2.conv_dw.0.stats tensor(15.1403)
blocks.5.2.conv_dw.1.stats tensor(13.3021)
blocks.5.2.conv_pwl.stats tensor(6.3677)
blocks.5.3.conv_pw.0.stats tensor(13.3304)
blocks.5.3.conv_pw.1.stats tensor(13.2739)
blocks.5.3.conv_dw.0.stats tensor(16.0722)
blocks.5.3.conv_dw.1.stats tensor(14.6379)
blocks.5.3.conv_pwl.stats tensor(8.0309)
blocks.6.0.conv_pw.0.stats tensor(12.9786)
blocks.6.0.conv_pw.1.stats tensor(13.6662)
blocks.6.0.conv_dw.0.stats tensor(16.3897)
blocks.6.0.conv_dw.1.stats tensor(17.3638)
blocks.6.0.conv_pwl.stats tensor(6.5583)
conv_head.0.stats tensor(3.8746)
conv_head.1.stats tensor(3.8746)
quant.stats tensor(34.5170)
classifier.stats tensor(6.9768)
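
For context, the check I mean is along the lines of the per-layer comparison in PyTorch's Numeric Suite tutorial, where the stat is the SQNR (in dB) between the float and quantized activations of each logged module. The sketch below is my simplified version, not the guide's exact code: `float_model`, `quantized_model`, and `calib_batch` are placeholders for my actual models and calibration batch, and the Numeric Suite module path may differ between PyTorch versions.

```python
import torch
import torch.quantization._numeric_suite as ns  # eager-mode Numeric Suite

def compute_error(x, y):
    # Signal-to-quantized-noise ratio (SQNR) in dB: higher is better,
    # so unusually low values point at layers that lose the most precision.
    Ps = torch.norm(x)
    Pn = torch.norm(x - y)
    return 20 * torch.log10(Ps / Pn)

# float_model: the original FP32 EfficientNet
# quantized_model: the same model after prepare()/convert()
# calib_batch: one batch of calibration images, shape (N, 3, H, W)
act_compare_dict = ns.compare_model_outputs(float_model, quantized_model, calib_batch)
for key in act_compare_dict:
    float_act = act_compare_dict[key]['float'][0]
    quant_act = act_compare_dict[key]['quantized'][0].dequantize()
    print(key, compute_error(float_act, quant_act))
```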

Is there anything suspicious here?