Why does my forward hook capture the wrong BatchNorm2d output?

I want to visualize the BatchNorm2d behavior in resnet18, and I use register_forward_hook to do it.

import torch
import torchvision

# the model whose BatchNorm2d / ReLU activations I want to inspect
resnet18 = torchvision.models.resnet18()

bn2d_inputs = {}
bn2d_outputs = {}
relu_inputs = {}
relu_outputs = {}

def get_layer_details(idx):
    def hook(layer, input, output):
        if isinstance(layer, torch.nn.BatchNorm2d):
            print(f"layer.__class__.__name__: {layer.__class__.__name__}")
            print(f"Input shape: {input[0].shape}")
            print(f"Output shape: {output.shape}")
            # store the tensors the hook receives for later inspection
            bn2d_inputs[str(idx) + "_" + layer.__class__.__name__] = input
            bn2d_outputs[str(idx) + "_" + layer.__class__.__name__] = output
            print("weight:", layer.weight)
            print("bias:", layer.bias)
        if isinstance(layer, torch.nn.ReLU):
            relu_inputs[str(idx) + "_" + layer.__class__.__name__] = input
            relu_outputs[str(idx) + "_" + layer.__class__.__name__] = output
    return hook

# register the same hook factory on every submodule
for idx, (name, layer) in enumerate(resnet18.named_modules()):
    layer.register_forward_hook(get_layer_details(idx))

The BatchNorm2d outputs I capture are all positive (> 0), which looks like the output of bn2d + relu, but I want the raw bn2d output.
What is my mistake?

torchvision.models.resnet18 uses inplace nn.ReLU modules (inplace=True). The ReLU that follows each BatchNorm2d overwrites the BatchNorm output tensor in place, so the tensor you stored in the hook has already been rectified by the time you look at it. Replace the ReLUs with their out-of-place version and you should see the real batchnorm output.
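
A minimal sketch of that swap, assuming a plain torchvision resnet18 instance named resnet18 (it uses nn.Module.get_submodule to find each ReLU's parent module):

import torch
import torchvision

resnet18 = torchvision.models.resnet18()

for name, module in resnet18.named_modules():
    if isinstance(module, torch.nn.ReLU) and module.inplace:
        # named_modules() yields dotted names like "layer1.0.relu";
        # resolve the parent module and replace the attribute on it
        parent_name, _, child_name = name.rpartition(".")
        parent = resnet18.get_submodule(parent_name) if parent_name else resnet18
        setattr(parent, child_name, torch.nn.ReLU(inplace=False))

After the swap, re-register your hooks and run a forward pass; bn2d_outputs should then contain negative values as well. Alternatively, you can keep the in-place ReLUs and store output.detach().clone() inside the hook, so the saved tensor cannot be modified by later layers.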