Is there a layer normalization for 2D feature maps with the same function as tf.contrib.layers.layer_norm in TensorFlow?
From what I see, no one has implemented it yet.
If the batch size is always 1, can I do it by reshaping a (1, C, H, W) tensor to (1, CHW), transposing it to (CHW, 1), applying batch norm, and then reshaping it back to (1, C, H, W)?
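A minimal sketch of the reshape trick described above (an illustration, not an official recipe): with batch norm in training mode and no affine parameters, normalizing over the (CHW, 1) batch dimension reproduces nn.LayerNorm over the whole sample.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.randn(1, 8, 4, 4)  # (1, C, H, W); shapes chosen for illustration

# Flatten to (1, CHW), transpose to (CHW, 1), batch-norm over the "batch" dim.
bn = nn.BatchNorm1d(1, affine=False)  # training mode uses batch statistics
flat = x.reshape(1, -1).t()           # (CHW, 1)
out = bn(flat).t().reshape_as(x)      # back to (1, C, H, W)

# Reference: LayerNorm over the entire (C, H, W) sample, no affine transform.
ln = nn.LayerNorm([8, 4, 4], elementwise_affine=False)
print(torch.allclose(out, ln(x), atol=1e-4))  # the two outputs agree
```

Note this only works for batch size 1: with a larger batch, the flattened "batch" dimension would mix statistics across samples.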
Use GroupNorm with a single group, as follows:
nn.GroupNorm(1, out_channels)
It is equivalent to LayerNorm. It is useful if you only know the number of channels of your input and you want to define your layers as such:
nn.Sequential(nn.Conv2d(in_channels, out_channels, kernel_size, stride), nn.GroupNorm(1, out_channels), nn.ReLU())
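A quick check of the equivalence claimed above (with affine parameters disabled on both sides, since GroupNorm's affine is per-channel while LayerNorm's is elementwise):

```python
import torch
import torch.nn as nn

x = torch.randn(4, 8, 16, 16)  # (N, C, H, W); shapes chosen for illustration

# GroupNorm with one group normalizes over (C, H, W) per sample...
gn = nn.GroupNorm(1, 8, affine=False)
# ...which matches LayerNorm applied over the full feature map.
ln = nn.LayerNorm([8, 16, 16], elementwise_affine=False)

print(torch.allclose(gn(x), ln(x), atol=1e-5))  # the two outputs agree
```

With the affine transforms enabled, the two modules are no longer identical: nn.GroupNorm learns one scale/shift per channel, while nn.LayerNorm learns one per element of the normalized shape.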
InstanceNorm2d is now implemented in PyTorch, and it can be used as a LayerNorm substitute for 2D convolutions.
InstanceNorm2d and LayerNorm are very similar, but have some subtle differences. InstanceNorm2d is applied on each channel of channeled data like RGB images, but LayerNorm is usually applied on an entire sample and often in NLP tasks. Additionally, LayerNorm applies an elementwise affine transform, while InstanceNorm2d usually does not apply an affine transform.
https://pytorch.org/docs/stable/generated/torch.nn.InstanceNorm2d.html
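The difference quoted above can be seen numerically. InstanceNorm2d zero-centers each (sample, channel) plane on its own, whereas a LayerNorm-style normalization (here via nn.GroupNorm(1, C) without affine, as suggested earlier in the thread) only zero-centers the whole (C, H, W) sample:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.randn(2, 3, 8, 8)  # (N, C, H, W)

inorm = nn.InstanceNorm2d(3)            # per-channel normalization, no affine by default
lnorm = nn.GroupNorm(1, 3, affine=False)  # whole-sample normalization, LayerNorm-like

yi = inorm(x)
yl = lnorm(x)

# Per-channel means: ~0 for InstanceNorm2d, generally nonzero for the
# LayerNorm-style output (only the whole-sample mean is ~0 there).
print(yi.mean(dim=(2, 3)).abs().max())
print(yl.mean(dim=(2, 3)).abs().max())
```

So the two are interchangeable only when per-channel versus per-sample statistics do not matter for your model.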