Type of transforms.Normalize()?

After creating a normalize transform like this:

    normalize = transforms.Normalize(
        mean=[0.485, 0.456, 0.406],
        std=[0.229, 0.224, 0.225]
    )
    print(type(normalize))

I get the type

  • <class 'torchvision.transforms.transforms.Normalize'>

Is that ok or not? I'm asking because on another machine I was getting

  • <class 'torchvision.transforms.Normalize'>

and my code was running properly; now it isn't…

The class is fine.
What kind of error do you get?
Your code works in this simple example:

import torch
import torchvision.transforms as transforms

normalize = transforms.Normalize(
    mean=[0.485, 0.456, 0.406],
    std=[0.229, 0.224, 0.225]
)

x = torch.randn(3, 24, 24)
y = normalize(x)

A timely response from me! :slight_smile:
Anyway, I am not sure whether I should get

<class 'torchvision.transforms.transforms.Normalize'>

or

<class 'torchvision.transforms.Normalize'>

Asking because my code is not running properly on the new machine, but it was fine earlier.

The Error I get is

TypeError: pic should be Tensor or ndarray. Got <class 'torch.DoubleTensor'>.

I can provide the full code if need be.

If I print the class, I get:

print(normalize.__class__)
> <class 'torchvision.transforms.transforms.Normalize'>

As far as I remember, this error message might indicate that you are passing an image tensor containing a batch dimension.
Could you check that, and if it's the case, pass the tensor as [channels, height, width]?

Hi. May I ask how to define the mean and std values for each image channel?
Moreover, can we set a parameter so that the CNN finds the optimal values (mean, std, or other weights/biases used in each channel) for the image preprocessing? If so, can you tell me how to set it up?

You could calculate the mean and stddev from your training dataset. Alternatively, you could use the ImageNet stats if your dataset is from a similar domain (natural images).
Often just 0.5 works fine, so you would have to try it out.
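For instance, per-channel stats could be computed like this (a minimal sketch; `data` here is a random stand-in for your stacked training images of shape [N, C, H, W]):

```python
import torch

# Illustrative stand-in for a stacked training set: [N, C, H, W]
# (in practice you would stack or iterate over your real dataset)
data = torch.rand(100, 3, 24, 24)

# Per-channel mean and std over all images and pixel positions
mean = data.mean(dim=(0, 2, 3))
std = data.std(dim=(0, 2, 3))
print(mean, std)
```

The resulting three-element tensors can then be passed directly as the `mean` and `std` arguments of `transforms.Normalize`.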

I'm not sure it's a good idea to learn these preprocessing values, as fixed normalization stats are often what speeds up training or makes it feasible in the first place.

ok. I got your idea. Thanks a lot.