How to fuse layers of any convolutional neural network?

Hello,
Is there a generalized way, or a single piece of code, to fuse the layers of any convolutional model
(e.g. AlexNet, ResNet, VGG)?
That is, one routine that works for any such model and fuses its conv + bn + relu layers.

Any update on this topic?
Thank you.

Check here.

Simply put: to fuse conv + bn + relu, you can replace the convolution operators with the fused operators linked above, and then replace the bn and relu modules with nn.Identity().

PS: Please check the correctness by yourself.
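As a sketch of the idea (not the code from the link above): in eval mode, a BatchNorm layer is an affine transform of the conv output, so its running statistics can be folded into the convolution's weights and bias. The helper name `fuse_conv_bn` below is my own; after fusing, the original bn module would be swapped for `nn.Identity()` as described.

```python
import torch
import torch.nn as nn

def fuse_conv_bn(conv: nn.Conv2d, bn: nn.BatchNorm2d) -> nn.Conv2d:
    """Fold BatchNorm running statistics into the conv weights (inference only)."""
    fused = nn.Conv2d(conv.in_channels, conv.out_channels, conv.kernel_size,
                      conv.stride, conv.padding, conv.dilation, conv.groups,
                      bias=True)
    with torch.no_grad():
        std = torch.sqrt(bn.running_var + bn.eps)
        # y = gamma * (W*x + b - mean) / std + beta
        fused.weight.copy_(conv.weight * (bn.weight / std).reshape(-1, 1, 1, 1))
        b = conv.bias if conv.bias is not None else torch.zeros(conv.out_channels)
        fused.bias.copy_(bn.weight * (b - bn.running_mean) / std + bn.bias)
    return fused

# Quick numerical check: fused conv matches conv -> bn in eval mode
conv, bn = nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8)
conv.eval(); bn.eval()
x = torch.randn(1, 3, 16, 16)
assert torch.allclose(bn(conv(x)), fuse_conv_bn(conv, bn)(x), atol=1e-5)
```

ReLU needs no weight folding; it only has to be applied after the fused conv, which is why the bn and relu slots can both become `nn.Identity()` once the fused operator computes conv + bn (+ relu) in one step.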

This can be achieved with FX Graph Mode Quantization, which will be released as a prototype feature in PyTorch 1.8.
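A minimal sketch of that workflow on a toy model (in the 1.8 prototype the import path is `torch.quantization.quantize_fx`; newer releases expose it under `torch.ao.quantization`, and the API may differ between versions):

```python
import torch
import torch.nn as nn
from torch.ao.quantization import quantize_fx  # torch.quantization.quantize_fx in 1.8

class Net(nn.Module):
    """Minimal conv + bn + relu model standing in for AlexNet/ResNet/VGG."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, 3)
        self.bn = nn.BatchNorm2d(8)
        self.relu = nn.ReLU()

    def forward(self, x):
        return self.relu(self.bn(self.conv(x)))

model = Net().eval()                # fusion requires eval mode
fused = quantize_fx.fuse_fx(model)  # symbolically traces and fuses conv+bn+relu

# Fusion is numerically equivalent in eval mode
x = torch.randn(1, 3, 16, 16)
assert torch.allclose(model(x), fused(x), atol=1e-5)
```

Because `fuse_fx` works on the traced graph rather than on module names, the same call applies to any traceable convolutional model, which is exactly the "one code for all models" the question asks for.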