Hi, thank you as always for your help.
I am writing a data loader using the torch.utils.data.DataLoader class.
I want to load the same-indexed files from different folders together, in one batch.
For example, suppose I have folders 0, 1, …, 9, and each folder contains the same-indexed files from 00.txt to 99.txt, like this:
folder 0 — 00.txt, 01.txt, …, 99.txt
folder 1 — 00.txt, 01.txt, …, 99.txt
…
folder 9 — 00.txt, 01.txt, …, 99.txt
I prepared a loading list file where the indices are shuffled, and then tried to load them as usual:
```python
for batch_index, inputs in enumerate(self.data_loader):
    process(inputs)
```
What I expect is that, if the loading list indicates 07.txt, the loader fetches 07.txt from all 10 folders above as one group. Instead, the files seem to be shuffled independently of one another. I guess this is due to parallel loading (e.g., num_workers > 0).
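Concretely, what I am aiming for is roughly the sketch below. This is only my rough idea, not working code: the root path, folder names, and the plain-text reading are placeholders for my actual setup.

```python
import os
from torch.utils.data import Dataset, DataLoader

class SameIndexDataset(Dataset):
    """Each sample bundles the same-indexed file from every folder,
    so shuffling sample indices cannot split a group apart."""

    def __init__(self, root, num_folders=10, num_files=100):
        self.root = root
        self.num_folders = num_folders
        self.num_files = num_files

    def __len__(self):
        # One sample per shared index (00 .. 99), not one per file.
        return self.num_files

    def __getitem__(self, idx):
        # Collect 07.txt (for example) from folder 0 .. folder 9 in one call.
        contents = []
        for folder in range(self.num_folders):
            path = os.path.join(self.root, str(folder), f"{idx:02d}.txt")
            with open(path) as f:
                contents.append(f.read())
        return contents

# shuffle=True now shuffles only the shared indices; each sample still
# contains all 10 folders' files for that index.
loader = DataLoader(SameIndexDataset("data"), batch_size=4, shuffle=True)
```

If I understand correctly, with this layout the shuffling happens only over the shared index, so the grouping should survive even with multiple workers.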
Is there a good way to load these groups correctly?
Thank you in advance.