How to keep samples in each mini batch unshuffled?

Hi,
I need each mini-batch to contain sequential data samples (frames of a video). With `torch.utils.data.DataLoader`, `shuffle=True` shuffles the sample indices, so the frames within each mini-batch are out of order, while `shuffle=False` returns both the samples and the mini-batches in their original order.
How can I draw the mini-batches in random order while keeping the data samples inside each mini-batch unshuffled?

I tried `new_dataloader = random.sample(list(dataloader), len(dataloader))`, but this occupies a huge amount of memory, since it materializes every batch up front. Is there any other mechanism in PyTorch?

I hope you can help me :)

You could either load the sequence in the `__getitem__` method, while the `DataLoader` and its sampler only provide the “start index” (you would need to adapt the `__len__` of your `Dataset` for this approach), or you could create a custom sampler which yields the desired “shuffled” indices and pass it to the `DataLoader` as a `BatchSampler`.
Here is a simple example of the first approach.
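A minimal sketch of that idea, assuming the video is already loaded as a single tensor of frames; `ClipDataset`, `seq_len`, and the random data are illustrative names, not part of the PyTorch API:

```python
import torch
from torch.utils.data import Dataset, DataLoader

class ClipDataset(Dataset):
    # Maps index i to the i-th block of seq_len consecutive frames.
    def __init__(self, frames, seq_len):
        self.frames = frames  # tensor of shape [num_frames, C, H, W]
        self.seq_len = seq_len

    def __len__(self):
        # number of non-overlapping clips; the sampler only draws clip indices
        return self.frames.size(0) // self.seq_len

    def __getitem__(self, index):
        # load seq_len consecutive frames starting at this clip's start index
        start = index * self.seq_len
        return self.frames[start:start + self.seq_len]

# fake video: 100 frames of shape [3, 24, 24]
frames = torch.randn(100, 3, 24, 24)
dataset = ClipDataset(frames, seq_len=5)

# shuffle=True randomizes the clip order, not the frames inside a clip;
# with batch_size=1 each loaded clip acts as one mini-batch of sequential frames
loader = DataLoader(dataset, batch_size=1, shuffle=True)
for clip in loader:
    batch = clip.squeeze(0)
    print(batch.shape)  # torch.Size([5, 3, 24, 24])
```

With `shuffle=True` the sampler randomizes which clip is drawn, but the frames inside each clip keep their original order.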
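And a sketch of the second approach: a custom sampler chunks the dataset indices into consecutive batches, shuffles only the batch order, and is passed to the `DataLoader` via its `batch_sampler` argument (`ShuffledBatchSampler` is a made-up name for illustration):

```python
import torch
from torch.utils.data import DataLoader, Sampler, TensorDataset

class ShuffledBatchSampler(Sampler):
    # Yields batches of consecutive indices with the batch order shuffled.
    def __init__(self, dataset_len, batch_size):
        # chunk [0, 1, ..., dataset_len - 1] into consecutive index batches
        self.batches = list(torch.arange(dataset_len).split(batch_size))

    def __iter__(self):
        # permute the batches, keeping the indices inside each batch in order
        for i in torch.randperm(len(self.batches)).tolist():
            yield self.batches[i].tolist()

    def __len__(self):
        return len(self.batches)

dataset = TensorDataset(torch.arange(20).float())
batch_sampler = ShuffledBatchSampler(len(dataset), batch_size=4)
loader = DataLoader(dataset, batch_sampler=batch_sampler)
for (batch,) in loader:
    print(batch)  # e.g. tensor([ 8.,  9., 10., 11.]): consecutive inside each batch
```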