Hi,
How do I choose the values for mean and std when using transforms.Normalize(mean, std)?
I have seen examples where Normalize(mean=[0.5, 0.5, 0.5], std=[0.5, 0.5, 0.5]) is used, but also cases where Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]) is used.
How are these values found; should they be calculated from my data set or are they appropriate constants?
@void32 The values for mean and std in these transforms come from different considerations:
Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]) uses the per-channel mean and std computed over the ImageNet dataset.
And,
Normalize(mean=[0.5, 0.5, 0.5], std=[0.5, 0.5, 0.5]) is not based on dataset statistics at all; it simply rescales tensor values from the [0, 1] range (after ToTensor()) to [-1, 1]. Note that it does not convert the image to grayscale.
You can check Understanding transform.Normalize() for more insight.
Hi Hemant,
Thanks for your quick answer.
Okay, so it’s simply the mean and std calculated from the ImageNet dataset; in theory, I should calculate them from my own data?
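Yes, if your data differs significantly from ImageNet (and you are not using a pretrained model that expects ImageNet statistics), you can compute the per-channel mean and std yourself. A sketch of one way to do it, assuming a hypothetical `dataset` whose images have already been converted with transforms.ToTensor() (shape (C, H, W), values in [0, 1]):

```python
import torch
from torch.utils.data import DataLoader

def compute_mean_std(dataset, batch_size=64):
    """Per-channel mean/std over a dataset yielding (image, label) pairs."""
    loader = DataLoader(dataset, batch_size=batch_size)
    n_pixels = 0
    channel_sum = torch.zeros(3)
    channel_sq_sum = torch.zeros(3)
    for images, _ in loader:
        b, c, h, w = images.shape
        images = images.view(b, c, -1)          # flatten spatial dims
        n_pixels += b * h * w
        channel_sum += images.sum(dim=(0, 2))
        channel_sq_sum += (images ** 2).sum(dim=(0, 2))
    mean = channel_sum / n_pixels
    # Population std: sqrt(E[x^2] - (E[x])^2)
    std = (channel_sq_sum / n_pixels - mean ** 2).sqrt()
    return mean, std
```

The resulting values can then be passed straight to transforms.Normalize(mean, std).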