Layer fusion not fusing BatchNorm

import copy
import torch

fused_model = copy.deepcopy(model)
model.eval()
fused_model.eval()  # fusion in eval mode folds BN into the conv
fused_model = torch.quantization.fuse_modules(fused_model, [["conv1", "bn1", "relu"]], inplace=False)

Original model:

ResNet(
  (conv1): Conv2d(3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias=False)
  (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  (relu): ReLU(inplace=True)
  (maxpool): MaxPool2d(kernel_size=3, stride=2, padding=1, dilation=1, ceil_mode=False)

Fused model:

ResNet(
  (conv1): ConvReLU2d(
    (0): Conv2d(3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3))
    (1): ReLU(inplace=True)
  )
  (bn1): Identity()
  (relu): Identity()

Shouldn't ConvBnReLU2d be the name of the fused block?

What do you mean by block?

A block can also be thought of as a layer.

ConvReLU2d is the type of the fused module that comes out of fusing the (conv - bn - relu) modules in the model: in eval mode the BatchNorm statistics are folded directly into the conv weights, so no separate BN module is left, and after fusion the result is assigned to the conv module (bn and relu become Identity). What is the question?
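
A quick way to check that the BatchNorm was folded into the conv (rather than just dropped) is to compare the outputs of the original and fused models. Here is a minimal sketch, assuming a torchvision ResNet like the one in your printout (the resnet18 constructor and the 1x3x224x224 input are just placeholders):

import torch
import torchvision

# Placeholder model: any network with a top-level conv1 -> bn1 -> relu
# sequence, as in the printout above, behaves the same way.
model = torchvision.models.resnet18()
model.eval()  # eval mode, so fusion folds BN using its final running stats

# inplace=False returns a fused copy and leaves `model` untouched
fused_model = torch.quantization.fuse_modules(
    model, [["conv1", "bn1", "relu"]], inplace=False
)

# bn1 and relu are now Identity, but the outputs still match the original
# model up to float rounding, which shows the BN statistics were merged
# into conv1's weights and bias rather than discarded.
x = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    print((model(x) - fused_model(x)).abs().max())  # ~1e-6

ConvBnReLU2d only appears when you fuse for quantization-aware training, i.e. with the model in train mode through the QAT fusion path (torch.ao.quantization.fuse_modules_qat in recent releases). Eval-mode fusion always folds the BN away, which is why the fused block is named ConvReLU2d.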