Why does nn.BatchNorm2d return different values from the formula defined in the paper?

May I know why?

Below is the version computed by following the formula defined in the paper:

y = \frac{x - \mathrm{E}[x]}{\sqrt{\mathrm{Var}[x] + \epsilon}} * \gamma + \beta

import math

import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()

    def forward(self, x):
        # Normalize using statistics computed over the whole tensor
        m = torch.mean(x)
        v = torch.var(x, unbiased=False)
        z = (x - m) / math.sqrt(v + 1e-5)
        # Apply the affine parameters gamma = 1.135, beta = 0.2756
        x = z * 1.135 + 0.2756
        return x

torch.manual_seed(1)
n = Net()
n.eval()
x = torch.rand((1, 16, 2, 2))
y = n(x)
print(y)

And here is the version computed by nn.BatchNorm2d:

import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        # track_running_stats=False: batch statistics are used even in eval mode
        self.conv2_bn = nn.BatchNorm2d(16, track_running_stats=False)
        nn.init.constant_(self.conv2_bn.weight, 1.135)
        nn.init.constant_(self.conv2_bn.bias, 0.2756)

    def forward(self, x):
        x = self.conv2_bn(x)
        return x

torch.manual_seed(1)
n = Net()
n.eval()
x = torch.rand((1, 16, 2, 2))
y = n(x)
print(y)

These two outputs are different… May I know why?

Your m and v end up as single-valued (scalar) tensors, computed over the whole input.
Instead they must be 16-valued tensors: one mean and one variance per channel, because nn.BatchNorm2d normalizes each channel separately.
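
A quick way to see this (a minimal check, assuming the same input shape as above):

import torch

x = torch.rand((1, 16, 2, 2))
print(torch.mean(x).shape)                 # torch.Size([]) - one scalar for the whole tensor
print(torch.mean(x, dim=(0, 2, 3)).shape)  # torch.Size([16]) - one mean per channel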

You need to compute the statistics per channel. Replace those lines with the versions below (keepdim=True keeps m and v broadcastable against x, and torch.sqrt replaces math.sqrt now that v is a tensor):

m = torch.mean(x, dim=(0, 2, 3), keepdim=True)
v = torch.var(x, dim=(0, 2, 3), unbiased=False, keepdim=True)
z = (x - m) / torch.sqrt(v + 1e-5)
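
For completeness, here is a minimal end-to-end sketch (not from the original posts) confirming that the per-channel computation matches nn.BatchNorm2d:

import torch
import torch.nn as nn

torch.manual_seed(1)
x = torch.rand((1, 16, 2, 2))

# Manual per-channel batch norm with gamma = 1.135, beta = 0.2756
m = torch.mean(x, dim=(0, 2, 3), keepdim=True)
v = torch.var(x, dim=(0, 2, 3), unbiased=False, keepdim=True)
y_manual = (x - m) / torch.sqrt(v + 1e-5) * 1.135 + 0.2756

# Module version with the same affine parameters
bn = nn.BatchNorm2d(16, track_running_stats=False)
nn.init.constant_(bn.weight, 1.135)
nn.init.constant_(bn.bias, 0.2756)
bn.eval()
y_module = bn(x)

print(torch.allclose(y_manual, y_module, atol=1e-6))  # should print True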

@chetan_patil Many thanks for your prompt reply. I had overlooked this; after fixing it, the results are the same.