Types of layers for Quantization

What types of layers are supported in PyTorch’s quantization framework, especially ones relevant to convnets?
I found Conv2d, Conv3d, and ReLU, but I couldn’t find any BatchNorm variants.

I’ll move the category to Quantization to get a bit more visibility for this topic. :wink:


You can check out https://pytorch.org/docs/stable/quantization.html#operation-coverage, though it might be a little out of date. To find the most up-to-date list of supported ops, you can take a look at https://github.com/pytorch/pytorch/tree/master/aten/src/ATen/native/quantized/cpu
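As a quick sketch of how BatchNorm fits in: quantized module counterparts live under `torch.nn.quantized` (where `BatchNorm2d`/`BatchNorm3d` exist in recent versions), but in eager-mode quantization BatchNorm is usually folded into the preceding Conv with `torch.quantization.fuse_modules` before quantizing. The `Block` module below is a made-up example, not from the thread:

```python
import torch
import torch.nn as nn

# Quantized counterparts are exposed under torch.nn.quantized:
print(hasattr(torch.nn.quantized, "Conv2d"))       # quantized conv exists
print(hasattr(torch.nn.quantized, "BatchNorm2d"))  # quantized BN exists in recent versions

# Typical convnet pattern: Conv -> BN -> ReLU. Before quantization,
# BN is folded into the conv weights and ReLU is fused into the conv.
class Block(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, kernel_size=3)
        self.bn = nn.BatchNorm2d(8)
        self.relu = nn.ReLU()

    def forward(self, x):
        return self.relu(self.bn(self.conv(x)))

m = Block().eval()  # fusion requires eval mode so BN stats are frozen
fused = torch.quantization.fuse_modules(m, [["conv", "bn", "relu"]])

# After fusion, `conv` becomes a fused ConvReLU2d (with BN folded in)
# and `bn`/`relu` are replaced by Identity placeholders.
print(type(fused.conv).__name__)
print(type(fused.bn).__name__)
```

So even without a standalone quantized BatchNorm in your version, the usual workflow still handles Conv+BN graphs by fusing them first.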