transforms.Resize

The default input image size for the ResNet network is 224 x 224, but the images in my dataset are 200 x 200. How should I handle this during preprocessing? Is it possible to use transforms.Resize(224)?

import torchvision.transforms as T
from torch.utils import data

class mydataset(data.Dataset):
    def __init__(self, root, transforms=None):
        ...
        # Store the pipeline on self so __getitem__ can find it.
        self.transforms = T.Compose([
            T.Resize((224, 224)),        # rescale 200 x 200 -> 224 x 224
            T.ToTensor(),
            T.Normalize(mean=[0.485, 0.456, 0.406],  # ImageNet statistics
                        std=[0.229, 0.224, 0.225])
        ])

    def __getitem__(self, index):
        ...
        data = self.transforms(data)
        return data, label


Thank you very much! So does this first enlarge the 200 x 200 image to 224 x 224?

Yes @ky_Pa, you are right.
In the above code you might need to write T.Resize((224, 224)).
If size is a sequence like (h, w), the output size will be matched to it. If size is an int, the smaller edge of the image will be matched to this number while keeping the aspect ratio, i.e. if height > width, the image will be rescaled to (size * height / width, size).
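
For clarity, here is a minimal sketch (not from the thread; it assumes torchvision and Pillow are installed) showing the difference between the two call styles:

import torchvision.transforms as T
from PIL import Image

img = Image.new("RGB", (200, 200))      # dummy 200 x 200 image

# Sequence (h, w): output is exactly 224 x 224.
print(T.Resize((224, 224))(img).size)   # (224, 224)

# Int: the smaller edge is matched to 224, aspect ratio preserved.
# For a square input this also yields 224 x 224 ...
print(T.Resize(224)(img).size)          # (224, 224)

# ... but a non-square input keeps its aspect ratio instead:
tall = Image.new("RGB", (100, 200))     # 100 wide, 200 tall
print(T.Resize(224)(tall).size)         # (224, 448)

Since your images are square, Resize(224) and Resize((224, 224)) happen to produce the same result, but the tuple form makes the intended output size explicit.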

Thanks for your answers!
