Normalizing a tensor along a dimension

I have a tensor X of shape [B, 3, 240, 320], where B is the batch size, 3 the number of channels, 240 the height, and 320 the width.

I need to compute the norm along the channel dimension (the 3 channels) and normalize along that dimension, i.e. each subtensor comprising the 3 channels should have norm 1.

Can you all please suggest how I can do that?

Thanks!

When you say "norm", which norm do you mean? Standard norm? Softmax norm? Euclidean norm?

Thanks for replying @AlphaBetaGamma96

L2 norm

In that case, you should be able to try this!

import torch

X = torch.randn(B, 3, 240, 320)  # B is your batch size
norm = X.pow(2).sum(dim=1).sqrt()
Xnorm = X / norm

The L2 norm (or Euclidean norm) is just the square root of the sum of squares, i.e. norm = sqrt(x^2 + y^2 + …). In your case, you just need to square the tensor (via .pow(2)), then sum along the dimension you wish to normalize (via .sum(dim=1)), then take the square root (via .sqrt()). That calculates your normalization constant. Then just divide the original tensor by that value and it should normalize your tensor along that dimension!
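As an aside, PyTorch also ships a built-in for exactly this: torch.nn.functional.normalize performs Lp normalization along a chosen dimension (and adds a small eps to guard against division by zero). A minimal sketch, assuming the same [B, 3, 240, 320] layout and an example batch size of 4:

import torch
import torch.nn.functional as F

X = torch.randn(4, 3, 240, 320)       # example batch, B = 4
Xnorm = F.normalize(X, p=2.0, dim=1)  # L2-normalize across the 3 channels

# each pixel's 3-channel vector now has unit L2 norm
print(Xnorm.pow(2).sum(dim=1).sqrt().mean())  # ~1.0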


@AlphaBetaGamma96
I tried:

Xnumerator = torch.cross(Xleft, Xright, dim=1)
norm2 = Xnumerator.pow(2).sum(dim=1).sqrt()
nu = Xnumerator / norm2

I am getting this error:

nu = Xnumerator / norm2
RuntimeError: The size of tensor a (3) must match the size of tensor b (60) at non-singleton dimension 1

Xnumerator → torch.Size([60, 3, 240, 320])
norm2 → torch.Size([60, 240, 320])


This is because your tensors have different shapes: .sum(dim=1) removes the channel dimension, so norm2 ([60, 240, 320]) can no longer broadcast against Xnumerator ([60, 3, 240, 320]).

You need to add the keepdim=True argument to your sum so the reduced dimension is kept with size 1, which lets broadcasting work. This works:

Xnumerator = torch.cross(Xleft, Xright, dim=1)             # [60, 3, 240, 320]
norm2 = Xnumerator.pow(2).sum(dim=1, keepdim=True).sqrt()  # [60, 1, 240, 320]
nu = Xnumerator / norm2                                    # broadcasts over dim 1
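For completeness, here is a self-contained sanity check of the fixed version (Xleft and Xright are just random stand-ins with your shapes):

import torch

# random stand-ins for the real inputs, matching the shapes above
Xleft = torch.randn(60, 3, 240, 320)
Xright = torch.randn(60, 3, 240, 320)

Xnumerator = torch.cross(Xleft, Xright, dim=1)
norm2 = Xnumerator.pow(2).sum(dim=1, keepdim=True).sqrt()
nu = Xnumerator / norm2

# every 3-vector along dim 1 should now have unit L2 norm
print(torch.allclose(nu.pow(2).sum(dim=1).sqrt(), torch.ones(60, 240, 320)))  # True

In practice you may also want to clamp norm2 with a small eps (as F.normalize does internally) in case a cross product comes out as the zero vector.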