Fuse ModuleList for quantization aware training

Hi, I’m trying to perform quantization-aware training (QAT). Currently, I have a customized Conv module that consists of a ModuleList of several Conv2d modules, followed by a ModuleList of the corresponding BatchNorm2d modules, e.g., ModuleList(Conv2d(), Conv2d()) -> ModuleList(BatchNorm2d(), BatchNorm2d()).
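For context, here is a simplified stand-in for my module (channel sizes and attribute names are just placeholders):

```python
import torch
import torch.nn as nn

class MyConv(nn.Module):
    # Simplified stand-in for the customized Conv module described above.
    def __init__(self):
        super().__init__()
        self.convs = nn.ModuleList([nn.Conv2d(3, 16, 3, padding=1) for _ in range(2)])
        self.bns = nn.ModuleList([nn.BatchNorm2d(16) for _ in range(2)])

    def forward(self, x):
        # Each Conv2d output is normalized by its corresponding BatchNorm2d.
        return [bn(conv(x)) for conv, bn in zip(self.convs, self.bns)]
```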

According to this, a fuse operation is required. However, it looks like fuse_modules only supports a limited set of patterns (e.g., conv-bn-relu, conv-bn, etc.). I’m wondering:

  • How can I fuse these ModuleLists so that the PyTorch quantization mechanism can be used? (My attempt so far is sketched below.)
  • Can PyTorch quantization-aware training work without fusing these modules?
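
For reference, here is roughly what I have been trying for the first question: addressing each (Conv2d, BatchNorm2d) pair through dotted qualified names, since entries of a ModuleList are reachable as e.g. 'convs.0'. This is a minimal sketch, assuming the simplified module above; I’m not sure it is the intended approach:

```python
from torch.quantization import fuse_modules

model = MyConv()
model.train()  # for QAT, fusion is done on a model in training mode

# One [conv, bn] pair per branch, using dotted qualified names
# into the ModuleLists ('convs.0' pairs with 'bns.0', and so on).
pairs = [[f"convs.{i}", f"bns.{i}"] for i in range(len(model.convs))]

# Note: on recent PyTorch versions, fuse_modules_qat may be required
# instead of fuse_modules when the model is in training mode.
fused = fuse_modules(model, pairs)
print(fused)
```

This runs the conv-bn pairs through the standard fuser one pair at a time, but I don’t know whether it is the recommended way to handle ModuleLists.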

Please help! Thanks!
