LayerNorm with variable shapes

Hi, I have a CNN that accepts inputs of shape (4, H, W), where H and W can vary. I would like to add a LayerNorm to normalize across the first dimension, which has size 4. But since I don't know H and W in advance, I can't create an nn.LayerNorm object. Is there any way to use LayerNorm with variable input shapes?

Using LayerNorm seems tricky in your case because LayerNorm learns two parameters, weight and bias, whose shapes are tied to the normalized_shape you pass in.
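One workaround that is sometimes used (an assumption on my part, not something guaranteed to fit your model): since nn.LayerNorm normalizes over the *trailing* dimensions, you can permute the channel dimension to the end, apply a LayerNorm over just those 4 channels, and permute back. The weight and bias then have shape (4,), which is independent of H and W. The `ChannelLayerNorm` name below is just illustrative:

```python
import torch
import torch.nn as nn

class ChannelLayerNorm(nn.Module):
    """Sketch: LayerNorm over the channel dim only, for variable H and W."""

    def __init__(self, num_channels: int):
        super().__init__()
        # weight/bias have shape (num_channels,), independent of H and W
        self.norm = nn.LayerNorm(num_channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, C, H, W) -> (N, H, W, C), normalize over C, move C back
        return self.norm(x.permute(0, 2, 3, 1)).permute(0, 3, 1, 2)

ln = ChannelLayerNorm(4)
print(ln(torch.randn(2, 4, 5, 7)).shape)  # torch.Size([2, 4, 5, 7])
print(ln(torch.randn(2, 4, 9, 3)).shape)  # torch.Size([2, 4, 9, 3])
```

Note this normalizes each spatial position independently across the 4 channels, which may or may not be the statistic you want.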

Hmm yeah, it seems like I can only use layer norm without weights via torch.nn.functional.layer_norm. Are there any other normalizations I could use in this use case?
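For reference, here is a sketch of the parameter-free functional form mentioned above, plus one commonly suggested alternative (my suggestion, not from the thread): nn.GroupNorm, whose parameters depend only on the number of channels, so it works with any H and W. Note that GroupNorm with a single group normalizes over channels and spatial positions jointly, which is a different statistic from a per-location channel norm:

```python
import torch
import torch.nn.functional as F

x = torch.randn(2, 4, 5, 7)  # (N, C, H, W); H and W can vary per batch

# Parameter-free layer norm over the channel dim: move C last so it is
# the trailing (normalized) dimension, then move it back.
y = F.layer_norm(x.permute(0, 2, 3, 1), normalized_shape=(4,)).permute(0, 3, 1, 2)

# GroupNorm alternative: learned per-channel weight/bias of shape (4,),
# independent of H and W. With num_groups=1 it normalizes over all
# channels and spatial positions together.
gn = torch.nn.GroupNorm(num_groups=1, num_channels=4)
z = gn(x)

print(y.shape, z.shape)  # both torch.Size([2, 4, 5, 7])
```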