Implement custom LayerNormalization layer for channel-wise normalization

Hi, currently PyTorch supports the LayerNorm operation with normalized_shape in the form [∗ × normalized_shape[0] × normalized_shape[1] × … × normalized_shape[−1]]. However, I want to implement a custom LayerNorm operation that normalizes only across channels. For example, given an input of size [N, C, W, H], the custom layer would normalize across C, but not W or H. I tried to find the source code of the LayerNorm implementation so I could modify it, but I couldn't locate it. Can anybody point me to the source code? I'm also looking at this tutorial, Extending PyTorch — PyTorch master documentation, to try to write the custom layer. It'd also be great if someone could show me the necessary steps to build the described custom layer, or point me to relevant tutorials. Thank you very much!
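
To make the desired behavior concrete, here is a small sketch (the shapes are only an example) contrasting what the built-in LayerNorm does with what I'm after:

import torch
import torch.nn as nn

x = torch.rand(2, 3, 4, 4)  # [N, C, W, H]

# The built-in LayerNorm normalizes over trailing dimensions,
# e.g. over all of [C, W, H] here:
full_norm = nn.LayerNorm([3, 4, 4])(x)

# What I want instead: mean and variance computed over C alone, so each
# (n, :, w, h) vector of channel values is normalized independently.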

You can use permute to move the dimensions you want to normalize over to the end, apply LayerNorm there, and then permute back. For your image example, this should do the trick:

import torch
from torch.nn.functional import layer_norm

img = torch.rand((1, 3, 256, 256))
# Move the channel axis to the end: [N, C, H, W] -> [N, H, W, C]
channels_last = img.permute(0, 2, 3, 1)
# Apply LayerNorm over the last dimension (the 3 channels)
normed = layer_norm(channels_last, normalized_shape=[3])
# Put the channel axis back: [N, H, W, C] -> [N, C, H, W]
normed_img = normed.permute(0, 3, 1, 2)
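
If you need this as a reusable layer with learnable per-channel scale and shift, you can wrap the same permute trick in an nn.Module. A minimal sketch, assuming a 4D [N, C, H, W] input (the class name ChannelLayerNorm is just for illustration):

import torch
import torch.nn as nn

class ChannelLayerNorm(nn.Module):
    """Applies LayerNorm over the channel dimension of [N, C, H, W] inputs."""

    def __init__(self, num_channels, eps=1e-5):
        super().__init__()
        # nn.LayerNorm over the last dim holds the learnable
        # per-channel weight and bias
        self.norm = nn.LayerNorm(num_channels, eps=eps)

    def forward(self, x):
        x = x.permute(0, 2, 3, 1)     # [N, C, H, W] -> [N, H, W, C]
        x = self.norm(x)              # normalize across the C channel values
        return x.permute(0, 3, 1, 2)  # back to [N, C, H, W]

layer = ChannelLayerNorm(3)
out = layer(torch.rand(1, 3, 256, 256))  # output has the same shape as the input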