# Norm of a Tensor

For a 2-dimensional tensor, I want to normalize each row vector of it. As a formula, each row `v_i` is replaced by `v_i / ||v_i||`.

I know the code below can do this easily:

```python
import torch

embedding_norm = torch.norm(embedding.weight.data, dim=1, keepdim=True)
```

But I don’t understand why the parameter dim is 1. I’ve read the docs, which say dim must be an int to compute a vector norm, but why is the number 1 here?

`dim` refers to the dimension along which the norm is computed, i.e. the dimension that gets reduced away.
Your two-dimensional tensor has 2 dims, (rows, columns), which correspond to dims (0, 1).
If you set dim=0, each norm is taken down a column, giving one norm per column.
If you set dim=1, each norm is taken across a row, giving one norm per row — which is what you want for row-wise normalization.
The same logic extends to N-dim tensors.
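A small sketch of the difference on a made-up 2×3 tensor (the values here are just for illustration):

```python
import torch

# A 2x3 tensor: 2 rows, 3 columns
x = torch.tensor([[3.0, 4.0, 0.0],
                  [0.0, 0.0, 5.0]])

# dim=0: reduce over the row dimension -> one norm per column, shape (3,)
col_norms = torch.norm(x, dim=0)
print(col_norms)  # tensor([3., 4., 5.])

# dim=1: reduce over the column dimension -> one norm per row, shape (2,)
row_norms = torch.norm(x, dim=1)
print(row_norms)  # tensor([5., 5.])
```

Note how dim=1 gives exactly one norm per row (sqrt(3² + 4²) = 5 for the first row), which is what you divide by to normalize each row.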

I don’t know the complete example, but most likely dimension 0 in your case is the batch size, so you want the norm of each vector in the batch.
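Putting it together, a minimal sketch of the full row-wise normalization (using a random `weight` tensor as a stand-in for your `embedding.weight.data`):

```python
import torch

# Stand-in for embedding.weight.data: 4 vectors of size 3
weight = torch.randn(4, 3)

# One norm per row; keepdim=True keeps shape (4, 1) so broadcasting works
norms = torch.norm(weight, dim=1, keepdim=True)

# Divide each row by its own norm -> every row now has unit length
normalized = weight / norms

print(torch.norm(normalized, dim=1))  # each entry is (approximately) 1.0
```

`keepdim=True` matters here: without it the norms would have shape (4,) and the division would not broadcast row-wise as intended.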