There isn’t a way to do this directly.
However, if you modify this file slightly, it’ll be possible: https://github.com/pytorch/pytorch/blob/master/torch/utils/data/dataloader.py#L176-L201
In particular, see the `__next__` function of `DataLoaderIter`: you could enumerate `self.sample_iter` fully beforehand (it yields the list of indices for each mini-batch), and then add a method to that iterator that returns the mini-batch at a particular index.
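As a rough sketch of the same idea without modifying the file: a `DataLoader` exposes its `batch_sampler` and `collate_fn`, so you can enumerate all mini-batch index lists up front and collate the k-th batch yourself. This assumes a map-style dataset (one that supports `dataset[i]`); the `get_batch` helper below is hypothetical, not part of the PyTorch API.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy map-style dataset: 10 samples, mini-batches of 3
dataset = TensorDataset(torch.arange(10).float())
loader = DataLoader(dataset, batch_size=3, shuffle=False)

# Enumerate every mini-batch's index list up front
# (this is what the iterator's sample_iter would yield one at a time)
batch_indices = list(loader.batch_sampler)

def get_batch(k):
    # Fetch and collate the samples of the k-th mini-batch,
    # using the same collate_fn the loader itself would apply
    return loader.collate_fn([dataset[i] for i in batch_indices[k]])

batch = get_batch(2)  # third mini-batch: samples 6, 7, 8
```

With `shuffle=True` the index lists would differ between epochs, so you'd re-enumerate `loader.batch_sampler` per epoch to match what the loader actually serves.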