What is the equivalent of np.std() in Pytorch?

In TensorFlow, you can get the mean and variance with tf.nn.moments:

mean, var = tf.nn.moments(x, axes=[1])

and in NumPy, mean, var = np.mean(x, axis=1), np.var(x, axis=1) (note that np.std() returns the standard deviation, not the variance, and the keyword is axis, not axes).

What is the equivalent function of getting variance in pytorch?


Tensor.mean([dim]), Tensor.std([dim]), and Tensor.var([dim]) compute the mean, standard deviation, and variance, optionally along an axis (dim):

>>> import torch
>>> x = torch.Tensor([[0, 1], [2, 3]])
>>> x.mean(dim=1)

 0.5000
 2.5000
[torch.FloatTensor of size 2]

>>> x.std(dim=1)

 0.7071
 0.7071
[torch.FloatTensor of size 2]

Thanks for your reply. In PyTorch, are gradients calculated for the mean and standard deviation automatically?
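Yes, they are. A minimal sketch (not from this thread) to check this yourself: mean and std are ordinary differentiable tensor ops, so autograd tracks them like any other operation.

```python
import torch

# Any tensor with requires_grad=True participates in autograd
x = torch.randn(4, 4, requires_grad=True)

# mean() and std() are differentiable; build a scalar loss from them
loss = x.std(dim=1).sum() + x.mean(dim=1).sum()
loss.backward()

# Gradients flow back to x automatically
print(x.grad.shape)  # same shape as x
```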

Thanks @colesbury. It works for me.

import torch
a = torch.randn(4, 4)
torch.std(a, dim=1)

Btw, the std computed by numpy is different!

>>> import numpy
>>> x = numpy.array([[0, 1], [2, 3]])
>>> x.std(1)
array([0.5, 0.5])

Which one should we use if we want e.g. to compute the std for transforms.Normalize(mean, std) ?
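One common recipe (a sketch, assuming an image batch of shape (N, C, H, W); the tensor here is made up for illustration) is to compute per-channel statistics over the batch and spatial dimensions:

```python
import torch

# Hypothetical batch: 8 RGB images of size 16x16, shape (N, C, H, W)
images = torch.rand(8, 3, 16, 16)

# Per-channel mean/std, reducing over batch (0) and spatial dims (2, 3)
mean = images.mean(dim=(0, 2, 3))
std = images.std(dim=(0, 2, 3))

print(mean.shape, std.shape)  # one value per channel
```

These three-element tensors are what you would pass to transforms.Normalize(mean, std).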


@kuzand np.std() and np.var() have an additional parameter ddof (delta degrees of freedom, default 0), while PyTorch uses the unbiased estimator (equivalent to ddof=1) by default. Type in x.std(1, ddof=1) and you'll get the same answer as PyTorch. Regarding transforms, all ddof changes is the denominator (N vs. N - 1), so for computing normalization statistics it makes little practical difference. Look at this thread here.
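To make the two libraries agree explicitly, a small sketch: torch.std() accepts an unbiased flag, so unbiased=False matches NumPy's default ddof=0, and NumPy's ddof=1 matches PyTorch's default.

```python
import numpy as np
import torch

x_np = np.array([[0.0, 1.0], [2.0, 3.0]])
x_t = torch.tensor(x_np)

# NumPy default: ddof=0 (divide by N); PyTorch default: unbiased (divide by N-1)
print(x_np.std(1))                     # [0.5 0.5]
print(x_t.std(dim=1, unbiased=False))  # tensor([0.5000, 0.5000], ...)

print(x_np.std(1, ddof=1))             # matches PyTorch's default below
print(x_t.std(dim=1))                  # tensor([0.7071, 0.7071], ...)
```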