# transforms.Normalize isn't idempotent?

I’m confused by what `transforms.Normalize` does. If it just normalizes a tensor to a given mean and standard deviation, shouldn’t it be idempotent? I find that running the same normalization multiple times gives different values each time. It also doesn’t seem to actually set the mean to what it says it does. An example below:

```python
import torch
from torchvision import transforms

normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])
vec = torch.empty((3,10,10)).random_(5)

print(torch.mean(vec, axis=(1,2)))
print(torch.mean(normalize(vec), axis=(1,2)))
print(torch.mean(normalize(normalize(vec)), axis=(1,2)))
```

Running this prints:

```
tensor([2.0000, 1.9800, 2.1500])
tensor([6.6157, 6.8036, 7.7511])
tensor([26.7717, 28.3374, 32.6449])
```

Shouldn’t the means be `0.485, 0.456, 0.406`? Why does running `normalize(normalize(vec))` give different values than just `normalize(vec)`? Apologies if I’m missing something obvious!

Hi Chanind!

No, this is not how `Normalize` works. It does not modify your tensor
so that it *has* the specified `mean` and `std`. Rather, according to
the documentation for `torchvision.transforms.Normalize`, it subtracts
the specified `mean` from your tensor and then divides by the specified
`std` (on a per-channel basis), i.e. `output = (input - mean) / std`.
That is why it is not idempotent: applying it a second time subtracts
`mean` and divides by `std` again, shifting and scaling the values
further. If you want the result to have (approximately) zero mean and
unit std, pass the tensor's own per-channel mean and std as the
`mean` and `std` arguments.
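You can check this per-channel arithmetic by hand against the numbers printed in your example. Below is a minimal sketch in plain Python (no torchvision needed), using the per-channel means your `vec` happened to have (2.0000, 1.9800, 2.1500) as representative input values:

```python
# Normalize computes output = (input - mean) / std, channel by channel.
mean = [0.485, 0.456, 0.406]
std = [0.229, 0.224, 0.225]

# Per-channel means of vec from the example above (values drawn from 0..4):
pixel = [2.0, 1.98, 2.15]

# One application of the transform's arithmetic:
once = [(p - m) / s for p, m, s in zip(pixel, mean, std)]

# A second application shifts and scales the already-normalized values again:
twice = [(o - m) / s for o, m, s in zip(once, mean, std)]

print(once)   # ≈ [6.6157, 6.8036, 7.7511]  -- matches print(normalize(vec)) means
print(twice)  # ≈ [26.7717, 28.3374, 32.6449] -- matches the double-normalize means
```

Since the mean is a linear statistic, applying the same affine map `(x - m) / s` to every element applies it to the channel mean as well, which is exactly why the printed means reproduce this way.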

Best.

K. Frank