Hi, I am trying to implement an L2 normalization layer. But I see that F.normalize is not accepted by nn.Sequential, since it is a function that requires an input rather than a module. Is there an L2 normalization layer in PyTorch?
F.normalize() is a function from the torch.nn.functional module and cannot be used directly inside an nn.Sequential container. However, you can easily create a custom normalization layer by subclassing nn.Module and wrapping F.normalize(). Something like below:
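For example (the class name L2Norm and its dim/eps arguments are just illustrative choices):

```python
import torch.nn as nn
import torch.nn.functional as F

class L2Norm(nn.Module):
    """Applies L2 normalization along a given dimension."""
    def __init__(self, dim=1, eps=1e-12):
        super().__init__()
        self.dim = dim
        self.eps = eps

    def forward(self, x):
        # F.normalize divides each vector along `dim` by its L2 norm
        return F.normalize(x, p=2, dim=self.dim, eps=self.eps)

# The wrapped layer can now be used inside nn.Sequential:
model = nn.Sequential(
    nn.Conv2d(3, 512, kernel_size=3, padding=1),
    L2Norm(dim=1),
)
```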
Thanks so much for your response. My input tensor to the L2 normalization layer is 128 x 512 x 14 x 14 (N x C x H x W). Will this perform the normalization over the batch dimension or over the channel dimension?
By default, F.normalize uses dim=1, so for an N x C x H x W tensor it normalizes over the channel dimension. If you would like to perform normalization over the batch dimension instead, you can change the dim parameter to 0. However, normalizing over the batch dimension is not a common practice, as it would mix information between different samples in the batch.
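A quick sanity check you can run (the shapes here just mirror your example):

```python
import torch
import torch.nn.functional as F

x = torch.randn(128, 512, 14, 14)  # N x C x H x W

# With the default dim=1, the length-512 channel vector at every
# (n, h, w) location is scaled to unit L2 norm.
out = F.normalize(x, p=2, dim=1)
print(out.norm(p=2, dim=1))  # values ~1.0 everywhere, shape 128 x 14 x 14
```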