Batch control during training

Hello, I have a question about PyTorch training. In my current setup I can set the batch size and provide the image size for every image by subclassing the data.Dataset class, but that means I have to use the same shape for all batches.
Can I choose the data shape separately for each batch? For example, the 1st batch 1x3x25x25, the 2nd batch 1x3x50x50, the 3rd batch 1x3x100x100. Please help me, thank you!

Hello, yes, this is possible if you write a custom collate function for your dataloader. Have a look at the example below; you might have to change it a little bit :slight_smile: Let me know if this is unclear, I'm just heading to sleep.

import random

import torch
from torchvision import transforms

def collate(batch):
    # pick one target size for this batch (here: chosen at random)
    im_sizes = [25, 50, 100]
    im_size = random.choice(im_sizes)
    uniform_size = transforms.Compose([
        transforms.Resize((im_size, im_size)),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225])
    ])

    # batch is the list of samples returned by the Dataset;
    # each sample is assumed to be a PIL image
    ims = [uniform_size(b) for b in batch]
    return torch.stack(ims)
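
For reference, this is roughly how you would plug the collate function into a DataLoader. It assumes your Dataset's __getitem__ returns a single PIL image per sample; DummyImageDataset below is just a stand-in for your own dataset:

from PIL import Image
from torch.utils.data import DataLoader, Dataset

class DummyImageDataset(Dataset):
    """Stand-in for your own Dataset; __getitem__ returns a PIL image."""
    def __len__(self):
        return 32

    def __getitem__(self, idx):
        return Image.new("RGB", (64, 64))

loader = DataLoader(DummyImageDataset(), batch_size=8, shuffle=True,
                    collate_fn=collate)

for ims in loader:
    # each batch gets its own spatial size, e.g. 8x3x25x25 or 8x3x100x100
    print(ims.shape)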

OK, I reworked my DataLoader and that resolved the problem. Thank you!