CIFAR 10 renormalization issue with PIL saved images

Hello fellows,

I have an issue properly displaying images with CIFAR10 and PIL.
My dataset is loaded using torchvision.datasets as follows (the mean and std values are the standard ones for CIFAR 10):

        dataset = datasets.CIFAR10('CIFAR10', train=False,
                       transform=transforms.Compose([
                           transforms.ToTensor(),
                           transforms.Normalize(mean=[0.485, 0.456, 0.406],
                                                std=[0.229, 0.224, 0.225])]))

Later in my code, I want to unnormalize some images in order to save them to disk and visualize them later.

I thus follow the definition of transforms.Normalize to recover my original images:

def convert_cifar10(t, pil):
    """Convert a cifar10 image tensor (already normalized)
    into a plottable image.

    :param t: image tensor of size (3,32,32)
    :type t: torch.Tensor
    :param pil: output is of size (3,32,32) if True, else (32,32,3)
    :type pil: bool
    """
    im = t.detach().cpu()
    # approximate unnormalization
    im[0] = im[0]*0.229 + 0.485
    im[1] = im[1]*0.224 + 0.456
    im[2] = im[2]*0.225 + 0.406
    if not pil:
        im = im.numpy()
        im = np.transpose(im, (1, 2, 0))
    return im

The minimal example is the following:

dataiter = iter(dataset)
data, label = next(dataiter)
original_img = convert_cifar10(data[0], pil=False)
second = convert_cifar10(data[0], pil=False)

The first plot displays something normal (I cannot display it as a new user, but these are for sure the original dataset images).
The second one is much less colored:

I may have applied the transformation twice to the input tensor, but the documentation explicitly states that the operation is performed out of place:

This transform acts out of place, i.e., it does not mutate the input tensor.

Is there anything I have misunderstood?

Thank you in advance :slight_smile:

My mistake, I double-checked my code today and realized I had used the .cpu() routine. According to the documentation:

Returns a copy of this object in CPU memory. If this object is already in CPU memory and on the correct device, then no copy is performed and the original object is returned.

So I was just unnormalizing my inputs twice: since the tensor was already in CPU memory, .cpu() returned the original object rather than a copy, my in-place channel assignments modified the dataset image itself, and the second call operated on the already unnormalized data.
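The aliasing is easy to check; a minimal sketch (the zero tensor here is just a stand-in for a normalized dataset image):

```python
import torch

t = torch.zeros(3, 32, 32)   # stands in for a normalized CIFAR10 image

im = t.detach().cpu()        # .cpu() is a no-op here: the detached view
                             # shares storage with t, so no copy is made
print(im.data_ptr() == t.data_ptr())  # same underlying memory

im[0] = im[0] * 0.229 + 0.485  # in-place write into the shared storage...
print(t[0, 0, 0].item())       # ...is visible through t (now ~0.485, not 0.0)
```

This is exactly why the second conversion started from an already unnormalized image.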

A solution to work on a copy of the tensor is to use torch.Tensor.clone().
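For reference, here is how the fix could look in the conversion function above (a sketch, using the same mean/std values as before):

```python
import numpy as np
import torch

def convert_cifar10(t, pil):
    """Unnormalize a CIFAR10 image tensor without mutating the input."""
    im = t.detach().cpu().clone()  # clone() guarantees a real copy
    im[0] = im[0] * 0.229 + 0.485
    im[1] = im[1] * 0.224 + 0.456
    im[2] = im[2] * 0.225 + 0.406
    if not pil:
        im = np.transpose(im.numpy(), (1, 2, 0))
    return im
```

Calling it twice on the same tensor now yields identical results, since the input is never modified.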