How can I get just one batch from torch.utils.data.DataLoader with fixed indices each time?

Hi everyone,

I am new to PyTorch. I want to get just one batch from a DataLoader and also have the indices of the samples in that batch. Then, every time, I want to use exactly those same sample indices (without them changing between runs). Any ideas?

Thank you in advance.
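For concreteness, here is a rough sketch of what I have in mind (the `IndexedDataset` wrapper and the index list are just placeholders, not something from an existing API):

```python
import torch
from torch.utils.data import DataLoader, Dataset, Subset, TensorDataset

class IndexedDataset(Dataset):
    """Wraps a dataset so each item is returned together with its index."""
    def __init__(self, base):
        self.base = base

    def __len__(self):
        return len(self.base)

    def __getitem__(self, idx):
        return self.base[idx], idx

# toy dataset of 100 samples
base = TensorDataset(torch.arange(100).float().unsqueeze(1))

# the fixed sample indices I want to use every time
fixed_ids = [3, 7, 11, 42]

# shuffle=False so the same batch (and the same indices) comes back on every run
loader = DataLoader(Subset(IndexedDataset(base), fixed_ids),
                    batch_size=len(fixed_ids), shuffle=False)

batch, ids = next(iter(loader))
print(ids)  # the sample indices for this batch
```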

@Nazila-H Is your question in the context of using DataLoader specifically in a distributed training setting, or is it about more general DataLoader behavior? If it’s the latter, there may be folks with more insight into this area if you post in the Uncategorized topic.

Yes, it is in a distributed setting.
I want the three virtual machines to see the same sample indices. Can I control the shuffling in torch.utils.data.DataLoader, or set a fixed random seed on each machine so the shuffle order matches?