A bug in the torch.norm function?

Hello, I am using torch.norm in my implementation (PyTorch 1.0.0a0+35c8f93). I noticed that when the tensor becomes very large, torch.norm does not return the answer I expect.

For example, I have a tensor of ones of size 4000000. The 2-norm of this tensor should be exactly sqrt(4000000) = 2000.

import torch
a = torch.ones([4000000])
torch.norm(a) #https://pytorch.org/docs/stable/torch.html#torch.norm
However, the function returns 22132.3770, whereas

torch.sum(a**2)

does return the correct squared norm of this tensor.
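
In case it helps, a minimal workaround sketch on the affected version is to take the square root of that (correct) sum explicitly instead of calling torch.norm:

import torch

a = torch.ones([4000000])
# Workaround: compute the 2-norm from the correctly computed squared sum.
manual_norm = torch.sqrt(torch.sum(a ** 2))
print(manual_norm)  # tensor(2000.)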

Does anyone have the same issue using the norm function?
Thanks in advance,

Chuan

Hi,

I ran the same code on my Windows machine in the PyCharm IDE, and I get 2000. No issues on my side.

Thanks

Which PyTorch version are you using?
It might be related to this issue, which should be fixed in the latest stable version.
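
As a quick sanity check (a minimal sketch), you could print the installed version and re-run the comparison:

import torch

print(torch.__version__)  # check which release is installed

a = torch.ones([4000000])
print(torch.norm(a))  # should print tensor(2000.) on a fixed version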


I am using version 1.0.0. That explains it, thanks a lot.