I want to train my model on the same batches across epochs with DataLoader. When I assign train_loader = DataLoader(batch_size=any_size) and use iter() and next() on train_loader, each epoch yields different batches, as expected. I checked with a small batch size (the dataset itself is much bigger) by printing the input data, and the batches keep changing from epoch to epoch until the data runs out. How can I fix the batches so that every epoch trains on the same ones?
Try disabling shuffling, i.e. DataLoader(dataset, batch_size=64, shuffle=False). Alternatively, you could set the random seed(s) to a fixed value, which would make your experiment reproducible.
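A minimal sketch of both options, using a toy TensorDataset as a stand-in for your real data (the dataset and sizes here are assumptions for illustration):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy dataset standing in for the real one
data = torch.arange(100).float().unsqueeze(1)
dataset = TensorDataset(data)

# Option 1: shuffle=False -- every epoch iterates in the same (dataset) order,
# so the i-th batch is identical across epochs.
loader = DataLoader(dataset, batch_size=10, shuffle=False)
epoch_a = [batch.clone() for (batch,) in loader]
epoch_b = [batch.clone() for (batch,) in loader]
assert all(torch.equal(a, b) for a, b in zip(epoch_a, epoch_b))

# Option 2: keep shuffling, but re-seed a dedicated generator before each
# epoch so the shuffled order is the same every time.
g = torch.Generator()
seeded_loader = DataLoader(dataset, batch_size=10, shuffle=True, generator=g)

def one_epoch(loader, generator, seed=0):
    generator.manual_seed(seed)  # same seed -> same shuffled order
    return [batch.clone() for (batch,) in loader]

epoch_c = one_epoch(seeded_loader, g)
epoch_d = one_epoch(seeded_loader, g)
assert all(torch.equal(c, d) for c, d in zip(epoch_c, epoch_d))
```

Note that with option 2 you must re-seed the generator before each epoch; seeding it once would still give a different order per epoch, since the generator's state advances as it is consumed.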
Luis's solution is fine.
For even more customization, consider writing a custom Sampler.
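As a sketch of the custom-Sampler route: you can shuffle the indices once, then reuse that fixed order every epoch (the FixedOrderSampler class and the toy dataset below are illustrative, not from the thread):

```python
import torch
from torch.utils.data import DataLoader, Sampler, TensorDataset

class FixedOrderSampler(Sampler):
    """Yields a precomputed index order, so every epoch sees identical batches."""
    def __init__(self, indices):
        self.indices = indices

    def __iter__(self):
        return iter(self.indices)

    def __len__(self):
        return len(self.indices)

dataset = TensorDataset(torch.arange(20).float())

# Shuffle once up front, then freeze that order for all epochs.
fixed_indices = torch.randperm(len(dataset)).tolist()
loader = DataLoader(dataset, batch_size=5,
                    sampler=FixedOrderSampler(fixed_indices))

epoch_a = [batch.clone() for (batch,) in loader]
epoch_b = [batch.clone() for (batch,) in loader]
assert all(torch.equal(a, b) for a, b in zip(epoch_a, epoch_b))
```

Note that sampler is mutually exclusive with shuffle=True; the sampler fully determines the iteration order, which is what makes the batches repeatable here.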