Iterating through the generator crashes when I add normalization

Hello,

I had a piece of code that was working fine, but when I added a normalization step to my “transforms”, it crashed.

Here’s my code for creating the generator:

from torchvision import transforms
from torch.utils.data import DataLoader

data_transforms_train = transforms.Compose([
    transforms.Resize((512, 512)),
    transforms.RandomVerticalFlip(),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.8979], std=[0.3025])
])

params = {'batch_size': 1,
          'shuffle': True,
          'num_workers': 0}

# Dataset is my own Dataset subclass; partition and labels are defined earlier
training_set = Dataset(partition['train'], labels, transform=data_transforms_train)
training_generator = DataLoader(training_set, **params)

Again, if I remove the “transforms.Normalize(mean=[0.8979], std=[0.3025])” line, everything goes back to working fine.

Any suggestion on what could be causing this?

Thanks!

Hi,

What is the error that you get? Have you tried putting the Normalize before the ToTensor()?
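
One way to surface the actual error is to apply your transform to a single image by hand, outside of the DataLoader. For instance, something along these lines (the path is just a placeholder, use one of your own training files):

# Apply the Compose to one image directly so any exception shows up
# as a normal traceback instead of being hidden by the DataLoader.
from PIL import Image

img = Image.open("path/to/one_training_image.png")  # placeholder path
print(img.mode, img.size)  # check whether the image is really single-channel ('L')

x = data_transforms_train(img)
print(x.shape, x.min(), x.max())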

Hello @albanD,

I tried running the following in a Jupyter notebook:

for test_images, test_labels in training_generator:  
    sample_image = test_images[0]   
    sample_label = test_labels[0]

And I just get a message that the kernel crashed and will be restarting.

It is very hard to say without more information.
You might want to run it outside of the notebook to get more information.
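
For instance, a minimal standalone script along these lines (my_dataset is just a placeholder for wherever your Dataset class, partition and labels are defined) will print a full traceback, or at least the signal it died with, when run with python from a terminal:

# minimal_repro.py -- run with `python minimal_repro.py` in a terminal
from torch.utils.data import DataLoader
from torchvision import transforms

# placeholder import: replace with your own module / definitions
from my_dataset import Dataset, partition, labels

data_transforms_train = transforms.Compose([
    transforms.Resize((512, 512)),
    transforms.RandomVerticalFlip(),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.8979], std=[0.3025])
])

training_set = Dataset(partition['train'], labels, transform=data_transforms_train)
training_generator = DataLoader(training_set, batch_size=1, shuffle=True, num_workers=0)

for i, (images, targets) in enumerate(training_generator):
    print(i, images.shape)
    if i >= 10:  # a few batches are usually enough to reproduce the crash
        break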