For `DataLoader`, set `batch_size=None` and pass your new sampler via `sampler=`.
Do you mean `batch_sampler=` instead of `sampler=`? From the docs, `batch_sampler=` yields a batch of indices at a time, which suits this case. Here's the doc: https://pytorch.org/docs/stable/data.html#torch.utils.data.DataLoader