I want to use PyTorch's DataLoader with shuffle=True, but at the same time I need access to the shuffled indices. Is it possible to know which samples of the data are used in a batch when shuffle=True?
Thanks.
The thread Indexing over DataLoader suggests using DataLoaderIter.sample_iter, but it is not clear to me how. I tried the following, as suggested in that thread:
data_loader_iter = iter(trainloader)
for i in enumerate(data_loader_iter.sample_iter):
    print(i)
But, interestingly, this does not return the indices of the first two batches. Do you know how I can fix this?
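For reference, here is a minimal sketch of an alternative I am considering, in case it helps clarify what I am after. The idea is to wrap the dataset so that __getitem__ also returns the index; each batch then carries the shuffled indices it was built from. The IndexedDataset wrapper name and the toy TensorDataset are my own, not from the linked thread.

```python
import torch
from torch.utils.data import Dataset, DataLoader, TensorDataset

class IndexedDataset(Dataset):
    """Wraps a dataset so each item also returns its index."""
    def __init__(self, dataset):
        self.dataset = dataset

    def __getitem__(self, idx):
        # Return the original sample together with its index.
        return self.dataset[idx], idx

    def __len__(self):
        return len(self.dataset)

# Toy example: 10 scalar samples.
data = torch.arange(10).float()
base = TensorDataset(data)
loader = DataLoader(IndexedDataset(base), batch_size=4, shuffle=True)

for batch, idx in loader:
    # idx is a tensor holding the shuffled indices used for this batch.
    print(idx)
```

This avoids touching DataLoader internals entirely, at the cost of changing the items the loader yields.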
Thanks.