BatchNorm not fusing with Conv and ReLU

I am trying to fuse the Conv, BatchNorm, and ReLU layers in a ResNet18 to prepare it for QAT.
However, after I run this:

model.eval()

torch.ao.quantization.fuse_modules(model, [["conv1", "bn1", "relu"]], inplace=True)

and then print the model, I see:

ResNet(
  (conv1): ConvReLU2d(
    (0): Conv2d(3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3))
    (1): ReLU(inplace=True)
  )
  (bn1): Identity()
  (relu): Identity()

As the output shows, the Conv and ReLU layers are getting fused, but the BatchNorm layer does not appear to be fused (bn1 just becomes Identity).
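
In case it helps, here is roughly a self-contained version of what I am running (a sketch assuming the stock torchvision resnet18; the deepcopy/allclose check at the end is just something I added to see whether the BatchNorm statistics get folded into the conv weights):

import copy

import torch
import torch.ao.quantization
from torchvision.models import resnet18

# Assumption: stock torchvision ResNet18; my actual model is set up the same way.
model = resnet18(weights=None)
model.eval()

# Fuse on a copy so the original model is kept around for comparison.
fused = copy.deepcopy(model)
torch.ao.quantization.fuse_modules(fused, [["conv1", "bn1", "relu"]], inplace=True)

# If the BatchNorm were folded into the conv, the two models should agree up to float error.
x = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    print(torch.allclose(model(x), fused(x), atol=1e-5))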

Could someone explain the reason behind this?
Thanks