I’d like to apply layernorm to a specific dimension of my tensor.
import torch

N = 1
C = 10
H = 10
W = 2
input = torch.randn(N, C, H, W)
In the above example, I’d like to apply layernorm along the C dimension.
Looking at the LayerNorm documentation, as I understand it, you can only tell nn.LayerNorm the shape of the dimensions you'd like to normalize over, and that normalized_shape is always matched against the trailing (rightmost) dimensions of the input. I think this creates a problem when the dimension I want to normalize, C here, is not the last one.
Concretely, if I do the following, I believe it doesn't normalize C at all: nn.LayerNorm(C) expects the *last* dimension to have size C, so with this input it should raise a shape error (the last dimension is W=2), and even if the sizes happened to line up, it would normalize the rightmost dimension, not C.
import torch
import torch.nn as nn

N = 1
C = 10
H = 10
W = 2
input = torch.randn(N, C, H, W)
layernorm = nn.LayerNorm(C)
output = layernorm(input)
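As a quick check (assuming the documented behavior that normalized_shape is matched against the trailing dimensions), running this does raise a shape error rather than silently normalizing some other dimension, because the last dimension here is W=2, not 10:

```python
import torch
import torch.nn as nn

N, C, H, W = 1, 10, 10, 2
x = torch.randn(N, C, H, W)

# normalized_shape is matched against the trailing dimensions,
# so LayerNorm(10) expects the *last* dimension to be 10 -- here it is W=2
try:
    nn.LayerNorm(C)(x)
except RuntimeError as e:
    print(e)  # complains about the mismatched normalized_shape
```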
Is there a way around this?
I suppose one solution is to transpose (perhaps using permute) before calling LayerNorm, then transpose back afterwards, but that feels a bit inelegant.
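For reference, the permute-based workaround I have in mind looks roughly like this (a sketch; `x` and `y` are just placeholder names):

```python
import torch
import torch.nn as nn

N, C, H, W = 1, 10, 10, 2
x = torch.randn(N, C, H, W)

layernorm = nn.LayerNorm(C)

# Move C to the last position, normalize, then move it back
y = layernorm(x.permute(0, 2, 3, 1))  # (N, H, W, C)
y = y.permute(0, 3, 1, 2)             # back to (N, C, H, W)

print(y.shape)  # torch.Size([1, 10, 10, 2])
```

After this, each slice `y[n, :, h, w]` has been normalized along C (roughly zero mean and unit variance before the affine parameters are learned), which is what I'm after.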