L2 Normalization Layer

Hi, I am trying to implement an L2 normalization layer, but I see that F.normalize is not accepted by nn.Sequential because it requires an input argument when called. Is there an L2 normalization layer in PyTorch?

self.backbone = torch.nn.Sequential(self.conv_b1,
                                    self.conv_b2,
                                    self.conv_b3,
                                    self.conv_b4_1,
                                    self.conv_b4,
                                    self.conv_b5,
                                    F.normalize())

TypeError: normalize() missing 1 required positional argument: 'input'

F.normalize() is a function from the torch.nn.functional module and cannot be used directly inside an nn.Sequential container, since Sequential expects nn.Module instances. However, you can easily create a custom normalization layer by subclassing nn.Module and wrapping F.normalize(). Something like below:

import torch.nn as nn
import torch.nn.functional as F

class L2NormalizationLayer(nn.Module):
    def __init__(self, dim=1, eps=1e-12):
        super().__init__()
        self.dim = dim
        self.eps = eps

    def forward(self, x):
        # L2-normalize the input along the chosen dimension
        return F.normalize(x, p=2, dim=self.dim, eps=self.eps)
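
With that wrapper, the layer can be used anywhere an nn.Module is expected, including at the end of your Sequential backbone. A minimal sketch, assuming the class above; the conv layers here are just placeholders standing in for your self.conv_b1 ... self.conv_b5 modules:

import torch
import torch.nn as nn

# Placeholder backbone; in your model these would be the conv_b* modules
backbone = nn.Sequential(
    nn.Conv2d(3, 512, kernel_size=3, padding=1),
    nn.ReLU(),
    L2NormalizationLayer(dim=1),  # L2-normalize along the channel dimension
)

x = torch.randn(8, 3, 14, 14)
out = backbone(x)
print(out.shape)  # torch.Size([8, 512, 14, 14])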

Thanks so much for your response. My input tensor to the L2 normalization layer has shape 128 x 512 x 14 x 14 (N x C x H x W). Will this perform the normalization over the batch dimension or over the channel dimension?

With dim=1 (the default in the layer above) the normalization is performed over the channel dimension, i.e. each length-512 feature vector at a given (n, h, w) position is scaled to unit L2 norm. If you would like to normalize over the batch dimension instead, you can change the dim parameter to 0. However, normalizing over the batch dimension is not a common practice, as it mixes information between different samples in the batch.
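
A quick way to confirm which dimension is normalized is to check the norm along dim=1 after the layer. A small sketch using the wrapper class defined earlier and the shape you mentioned:

import torch

layer = L2NormalizationLayer(dim=1)   # the wrapper defined above
x = torch.randn(128, 512, 14, 14)     # N x C x H x W
y = layer(x)

# Each channel vector now has unit L2 norm, so this is all ones (up to eps)
norms = y.norm(p=2, dim=1)            # shape: torch.Size([128, 14, 14])
print(torch.allclose(norms, torch.ones_like(norms)))  # True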
