Why does np.linalg.norm give a different result from tensor.norm?

It’s really weird that the following small code snippet gives different results for np.linalg.norm and torch.norm.

import torch
from torch.nn import functional as F
import numpy as np


def main():
    out_channels = 16
    kernelsz = 5
    stride = 1
    padding = 0

    torch.manual_seed(22)

    # Fixed convolution weights and bias; no gradients needed.
    w = torch.randn(out_channels, 3, kernelsz, kernelsz, requires_grad=False)
    b = torch.randn(out_channels, requires_grad=False)

    for i in range(100):
        x = torch.randn(4, 3, 28, 28)
        out1 = F.conv2d(x, w, b, stride=stride, padding=padding)
        # Compare PyTorch's norm with NumPy's norm on the same data.
        print(i, out1.norm().item(), np.linalg.norm(out1.detach().numpy()))


if __name__ == '__main__':
    main()

It outputs:

o@m:~/arc/$ python myF.py 
0 2070.43212890625 1630.2097
1 2105.404541015625 1656.4425
2 2107.125244140625 1656.174
3 2130.1220703125 1678.5837
4 2120.11669921875 1662.5927
5 2096.589599609375 1650.8307
6 2068.27001953125 1625.7941
7 2120.5830078125 1667.673
8 2090.04443359375 1640.3636
9 2107.760986328125 1655.7458
10 2107.1748046875 1660.1805
...

Can anyone help me track down the bug? Thank you.

I get the same output from both functions on macOS with PyTorch 1.0.1:

0 1630.209716796875 1630.2097
1 1656.4425048828125 1656.4425
2 1656.1739501953125 1656.174
3 1678.583740234375 1678.5837
4 1662.5927734375 1662.5927

Here is an excerpt from the torch.norm() documentation:

If the input tensor has more than two dimensions, the vector norm will be applied to last dimension.
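
As an aside: in recent PyTorch versions the reduction dimension can be requested explicitly with the dim argument; with no arguments, the norm is taken over all elements. A small sketch (shapes are illustrative only):

import torch

x = torch.randn(2, 3, 4)
per_vector = torch.norm(x, dim=-1)  # vector norm along the last dim, shape (2, 3)
total = torch.norm(x)               # default: norm over all elements, a scalar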

The following is from the numpy.linalg.norm() documentation:

Input array. If axis is None, x must be 1-D or 2-D.

The behavior of numpy.linalg.norm is not defined when the input array has more than two dimensions. I don’t know how it is computed in that case, but it is not the proper usage of numpy.linalg.norm; arguably the function should raise an error here.
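
To stay away from that corner of the API entirely, you can flatten the array first so the call is unambiguously the 2-norm of a 1-D input. A minimal sketch (the tensor shape is just an example):

import torch
import numpy as np

t = torch.randn(4, 3, 28, 28)
a = t.numpy()

# torch.norm() with no arguments reduces over all elements.
print(t.norm().item())
# Flattening first keeps numpy.linalg.norm inside its documented
# 1-D case: the 2-norm of the raveled array, which should agree
# with torch.norm on a correct PyTorch build.
print(np.linalg.norm(a.ravel()))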

I guess there is a bug in the CPU version of norm in PyTorch 1.0.

I think this might be related to this issue.


@all Yes, it’s a bug in PyTorch 1.0.0; updating to 1.0.1 fixes it.
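
If you want a quick sanity check after upgrading, something like this sketch (tolerance chosen arbitrarily) should pass on 1.0.1 and later:

import torch
import numpy as np

print(torch.__version__)  # expect 1.0.1 or later

x = torch.randn(4, 3, 28, 28)
# The two norms should now agree up to float32 rounding.
assert np.isclose(x.norm().item(), np.linalg.norm(x.numpy()), rtol=1e-5)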