@Harry-675
I’m really sorry, that might not work (I’m not sure). If the above doesn’t work, please try the approach below.
A more naive solution is to prepare a Dataset and DataLoader for each folder, and then loop over all the dataloaders together, as in Train simultaneously on two datasets, if you don’t care about the order of sampling within each folder.
So,
import os
import torch
from torch.utils.data import DataLoader, Dataset

class ImageData(Dataset):
    def __init__(self, root='train/folder_1', loader=image_load_func, transform=None):
        self.root = root
        self.files = os.listdir(self.root)  # one sample per file in the folder
        self.loader = loader                # image_load_func: your image loading function
        self.transform = transform

    def __len__(self):
        return len(self.files)

    def __getitem__(self, index):
        img = self.loader(os.path.join(self.root, self.files[index]))
        return self.transform(img) if self.transform is not None else img
loader_1 = DataLoader(ImageData('train/folder_1'), batch_size=3)
...
loader_8 = DataLoader(ImageData('train/folder_8'), batch_size=3)

for batch in zip(loader_1, ..., loader_8):  # pass all eight loaders to zip
    batch = torch.cat(batch, dim=0)  # concatenate along the batch dimension
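In case it helps, here is a minimal runnable sketch of that same loop. It assumes a PIL-based image_load_func (define it before the ImageData class above, since it’s used as a default argument) and that the folders train/folder_1 … train/folder_8 exist; those names are just placeholders for your own paths.

from PIL import Image
from torchvision import transforms

def image_load_func(path):
    # assumed loader: read the file and force 3 channels
    return Image.open(path).convert('RGB')

# one DataLoader per folder
loaders = [DataLoader(ImageData(f'train/folder_{i}', transform=transforms.ToTensor()),
                      batch_size=3)
           for i in range(1, 9)]

for batches in zip(*loaders):          # one batch from each folder per step
    batch = torch.cat(batches, dim=0)  # 8 folders x 3 samples = 24 images per step
    # ... run your training step on `batch` ...

Note that zip stops as soon as the smallest folder runs out of batches, so if the folders have different numbers of images you may want to cycle the smaller loaders (e.g. with itertools.cycle).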