"Stack expects each tensor to be equal size" error

Hi,

I am getting this error, and I understand why it's happening, but I don't know how to fix it. :confused:

I am using a custom dataset to load images from a directory:

import os

import numpy as np
import torch
from torch.utils.data import Dataset


class DiffPatternsDataSet(Dataset):
    def __init__(self, main_dir):
        self.main_dir = main_dir
        self.all_imgs = os.listdir(main_dir)

    def __len__(self):
        return len(self.all_imgs)

    def __getitem__(self, idx):
        if torch.is_tensor(idx):
            idx = idx.tolist()
        img_loc = os.path.join(self.main_dir, self.all_imgs[idx])
        # load the text file as a 2D array and add a channel dimension
        image = torch.from_numpy(np.genfromtxt(img_loc)).unsqueeze(0)
        filename = self.all_imgs[idx]
        # split_pattern (defined elsewhere) returns a variable number of tiles per image
        tiles = split_pattern(image, tile_size=80, stride=2)
        return tiles, filename

The split_pattern function basically splits an image into patches using Unfold and then discards the patches that contain only white noise. As a result, for each image I get a tiles tensor of shape (#tiles, channels, width, height), where the number of tiles varies from image to image. So when the DataLoader tries to collate a batch, the default torch.stack fails because the tensors have different sizes.
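For reference, split_pattern does something along these lines (a simplified sketch, not the exact code; the reshaping assumes a (channels, H, W) input, and the noise filter here is just a placeholder standard-deviation threshold):

import torch
import torch.nn.functional as F

def split_pattern(image, tile_size, stride):
    # image: (channels, H, W) -> add a batch dim so F.unfold can slide over it
    patches = F.unfold(image.unsqueeze(0), kernel_size=tile_size, stride=stride)
    # patches: (1, channels * tile_size * tile_size, num_tiles)
    channels = image.shape[0]
    tiles = patches.squeeze(0).transpose(0, 1).reshape(-1, channels, tile_size, tile_size)
    # placeholder noise criterion: keep tiles whose intensity spread exceeds a threshold
    keep = tiles.reshape(tiles.shape[0], -1).std(dim=1) > 0.05
    return tiles[keep]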

Is there any solution for this? Maybe I am not approaching the problem correctly. I want to train my model on the tiles I get from the images (and each tile needs to keep the original image filename attached). Maybe I should do things differently in __getitem__?

Any help or input will be greatly appreciated!

Just for future reference, in case anyone runs into this: I solved my problem by passing a custom collate_fn to the DataLoader. Instead of calling torch.stack, it collects all my tiles into a list, so the tensor size mismatch is no longer an issue.
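A minimal sketch of the idea (not my exact code; names like my_collate and dataset are just placeholders):

from torch.utils.data import DataLoader

def my_collate(batch):
    # batch is a list of (tiles, filename) pairs from __getitem__;
    # keep the variable-sized tile tensors in a plain Python list instead of stacking them
    tiles = [item[0] for item in batch]
    filenames = [item[1] for item in batch]
    return tiles, filenames

loader = DataLoader(dataset, batch_size=4, collate_fn=my_collate)

Each batch then comes out as a list of per-image tile tensors plus the matching filenames, and you can iterate over or concatenate the tiles inside the training loop however you like.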

Would you be able to share your custom collate_fn code please?