Transform resize not working

    transform = transforms.Compose([transforms.Resize(224),
                                    transforms.ToTensor(),
                                    transforms.Normalize(mean=[0.5, 0.5, 0.5],
                                                         std=[0.5, 0.5, 0.5])])


    train_dataset = torchvision.datasets.ImageFolder(root=DATASET_PATH + '/train/train_data', transform=transform)

    train_loader = DataLoader(train_dataset, batch_size=32, shuffle=True, num_workers=2)



    print("Found {} number of mini-batches".format(len(train_loader)))

    t0 = time.time()
    total_step = len(train_loader)
    for epoch in range(nb_epoch):
        
        avg_loss = 0.0
        
        for i, (images, labels) in enumerate(train_loader):
            images = images.to(device)
            labels = labels.to(device)

RuntimeError: invalid argument 0: Sizes of tensors must match except in dimension 0. Got 224 and 475 in dimension 2 at /pytorch/aten/src/TH/generic/THTensorMath.cpp:3616

I tried to apply transforms.Resize and load the data during training, and this error came up. It says the image sizes are different. How can this happen even though I resized all the images to the same size?
Any advice would be welcome!

If your images have different sizes and are not square, passing a single int to transforms.Resize matches only the smaller edge to that size, so there is no guarantee that all images end up with the same height and width.
See here: https://pytorch.org/docs/stable/torchvision/transforms.html#torchvision.transforms.Resize
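
Here is a minimal sketch of that behaviour, using a hypothetical 475x800 dummy image (the sizes are made up for illustration):

    from PIL import Image
    from torchvision import transforms

    # Hypothetical non-square image, 475x800 (width x height).
    img = Image.new("RGB", (475, 800))

    # A single int matches only the *smaller* edge to 224 and keeps the
    # aspect ratio, so the longer edge varies with each image's aspect ratio.
    resized = transforms.Resize(224)(img)
    print(resized.size)  # (224, 377) -- not square, so batching still fails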


Thanks for your reply. Then what else can I do when I load the datasets?
I’m working on an image retrieval problem, so the image sizes are all different.
For example, there are 400x800, 600x800, 700x400, 400x300 and many more.
How can I decide the right size for the network?


I should’ve mentioned that you can create the transform as transforms.Resize((224, 224)). If you pass a tuple, all images will be resized to the same height and width.
This issue comes from the DataLoader rather than the network itself: when the DataLoader collates samples into a batch, it expects all tensors to have the same shape.
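
For example, this is the transform from the original snippet with only the Resize argument changed to a tuple (a minimal sketch; the rest of the pipeline stays the same):

    from torchvision import transforms

    # A (height, width) tuple forces every image to exactly 224x224,
    # so all tensors in a batch share the same shape and can be stacked.
    transform = transforms.Compose([transforms.Resize((224, 224)),
                                    transforms.ToTensor(),
                                    transforms.Normalize(mean=[0.5, 0.5, 0.5],
                                                         std=[0.5, 0.5, 0.5])])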


It’s not working for me.

You can do transforms.Resize((224, 224)).