Not getting what's happening in my code

I am new to PyTorch and was setting up my notebook for an image classification task using the pre-trained inception_v3 model. I first load the image using PIL and preprocess it as follows:

from torchvision import transforms

mean = [0.485, 0.456, 0.406]   # standard ImageNet normalization stats
std = [0.229, 0.224, 0.225]

preprocess = transforms.Compose([
                transforms.Resize((299,299)),
                transforms.ToTensor(),
                transforms.Normalize(mean, std)
            ])
image_tensor = preprocess(img)
image_tensor = image_tensor.unsqueeze(0)

Here is the problem. Now, I want to display this input image (image_tensor). My next cell looks like this:

x = image_tensor.squeeze(0) #remove batch dimension
#unnormalize
x.mul_(torch.FloatTensor(std).view(3,1,1)).add_(torch.FloatTensor(mean).view(3,1,1))
x = x.numpy()
x = np.transpose(x, (1, 2, 0))   # C x H x W  ==>  H x W x C
x = np.clip(x, 0, 1)
plt.imshow(x)

If I run the above cell, it works fine and the image is displayed. But if I run the cell again, the image gets darker, and if I run it once more it gets even darker. I have attached images. I don't understand why this is happening or how I can stop it. I want this cell to display the original input image every time I run it. Any explanation would be helpful. Thanks!

[Screenshots: the displayed image after successive runs of the cell, getting progressively darker]

It looks like you are modifying your data, so every time you run the cell the same modification is applied again.
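
One quick way to check is to snapshot the tensor before running that cell and compare afterwards, something like this (a rough sketch, assuming image_tensor is the normalized tensor from your first cell):

import torch

before = image_tensor.clone()               # snapshot of the data before running the display cell
# ... now run your display cell ...
print(torch.equal(before, image_tensor))    # False here means the display cell modified image_tensor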

Hi, I understand that, but I'm not getting where it is happening. I don't change image_tensor in this cell. Do you know how I can stop this? Thanks!

Yeah, my bad. It looks like mul_ and add_ are in-place operations, which means the underlying data gets modified. That's probably why this is happening. Use:

http://pytorch.org/docs/master/torch.html#torch.add

http://pytorch.org/docs/master/torch.html#torch.mul
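
Something along these lines should leave image_tensor untouched (a minimal sketch of your display cell using out-of-place ops; mean and std are the lists from your first cell):

x = image_tensor.squeeze(0)                     # still a view of image_tensor
x = x * torch.FloatTensor(std).view(3, 1, 1)    # out-of-place: returns a new tensor
x = x + torch.FloatTensor(mean).view(3, 1, 1)   # image_tensor itself stays normalized
x = np.transpose(x.numpy(), (1, 2, 0))          # C x H x W  ==>  H x W x C
x = np.clip(x, 0, 1)
plt.imshow(x)

Alternatively, calling .clone() on the squeezed tensor before your original mul_/add_ would also work, since the in-place ops would then only touch the copy.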

I tried that. I also tried replacing that line with manual ops on each channel. It didn’t work.

Now, it works. I must have made some mistake.
Thanks!
