Add normalization layer in the beginning of a pretrained model

I’m using a pretrained UNet model whose first encoder has the following architecture

UNet(
  (encoder1): Sequential(
    (enc1conv1): Conv2d(3, 32, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
    (enc1norm1): BatchNorm2d(32, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (enc1relu1): ReLU(inplace=True)
    (enc1conv2): Conv2d(32, 32, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
    (enc1norm2): BatchNorm2d(32, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (enc1relu2): ReLU(inplace=True)
  )
  ...
)

The model takes an input that has been normalized using min-max normalization. Instead, I want to add a batch/layer norm layer at the beginning so that I can feed the image as it is, without normalizing it first.

I don’t want to use torchvision.transforms to normalize the image; instead, I want to add a layer at the beginning that does the same work for me.

Sorry if this question has some flaws; I’m new to PyTorch.

Batchnorm layers do not work in the same way as the transforms.Normalize transformation: the latter uses fixed, static statistics, while a batchnorm layer uses the batch statistics during training and its running estimates during validation.
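If the goal is really to reproduce that fixed normalization inside the model, a layer with constant statistics would do it. Here is a minimal sketch (the InputNorm name and the 0–255 range are just placeholders for your actual data range):

import torch
import torch.nn as nn

class InputNorm(nn.Module):
    """Fixed-statistics input normalization (min-max style).

    Unlike BatchNorm2d, the statistics are constants, so the layer behaves
    identically in train and eval mode -- the same role transforms.Normalize
    plays outside the model.
    """
    def __init__(self, min_val=0.0, max_val=255.0):
        super().__init__()
        # buffers move with .to(device) and are saved in the state_dict,
        # but are not trained
        self.register_buffer("min_val", torch.tensor(float(min_val)))
        self.register_buffer("max_val", torch.tensor(float(max_val)))

    def forward(self, x):
        return (x - self.min_val) / (self.max_val - self.min_val)

# wrap the pretrained model so raw images can be fed directly, e.g.:
# model = nn.Sequential(InputNorm(0.0, 255.0), pretrained_unet)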

However, in case you still want to add this layer: based on the model description, I assume you are using a custom nn.Module that uses self.encoder1 as one of its submodules. If that’s the case, add self.bn1 = nn.BatchNorm2d(3) in the __init__ method of UNet and call it in the forward method before executing self.encoder1, as sketched below.
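A minimal sketch of that change, assuming a simplified UNet definition (only the printed encoder1 block is reproduced, without the original layer names; everything else stays as in your model):

import torch.nn as nn

class UNet(nn.Module):
    def __init__(self):
        super().__init__()
        # new: batchnorm over the 3 raw input channels
        self.bn1 = nn.BatchNorm2d(3)
        # first encoder block, as in the printed architecture above
        self.encoder1 = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=1, padding=1, bias=False),
            nn.BatchNorm2d(32),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, kernel_size=3, stride=1, padding=1, bias=False),
            nn.BatchNorm2d(32),
            nn.ReLU(inplace=True),
        )
        # ... remaining encoder/decoder blocks of the original UNet ...

    def forward(self, x):
        x = self.bn1(x)       # normalize the raw image first
        x = self.encoder1(x)
        # ... rest of the original forward pass ...
        return x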