Manually set number of batches in DataLoader

Hello, I’m quite new to PyTorch and I have a question.

In DataLoader, there’s a batch_size argument that determines the size of each batch, and the number of batches then follows from the total amount of data in the set. However, I’d like to manually control the number of batches run per epoch. Is there a way to do this while still using DataLoader?

Thank you

A simple approach would be to use the index of the training loop and call break once you’ve sampled the desired number of batches.
Alternatively, you could split the Dataset using random_split with the desired length, or use a sampler, but I think the first approach is the easiest if you just want to change the number of batches.
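For reference, the two alternatives could be sketched like this (`dataset`, `batch_size`, and `n_batch` are placeholder names for illustration, not from the original post):

```python
import torch
from torch.utils.data import DataLoader, RandomSampler, TensorDataset, random_split

dataset = TensorDataset(torch.randn(100, 4))  # toy dataset with 100 samples
batch_size = 8
n_batch = 5  # desired number of batches per epoch

# Option 1: random_split -- keep only enough samples for n_batch batches
# and discard the rest.
subset_len = n_batch * batch_size
subset, _ = random_split(dataset, [subset_len, len(dataset) - subset_len])
train_dataloader = DataLoader(subset, batch_size=batch_size)
print(len(train_dataloader))  # 5

# Option 2: a sampler that draws a fixed number of samples each epoch,
# so the full dataset stays available across epochs.
sampler = RandomSampler(dataset, replacement=True, num_samples=n_batch * batch_size)
sampled_loader = DataLoader(dataset, batch_size=batch_size, sampler=sampler)
print(len(sampled_loader))  # 5
```

Note that with random_split the discarded samples are never seen, while the sampler redraws from the whole dataset every epoch.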

So, like this?

for i, x in enumerate(train_dataloader):

    # stop once n_batch batches have been processed
    # (checking before processing avoids running one batch too many)
    if i == n_batch:
        break

    # load x and process

Yes, that should work.

That fixed my problem. Thank you very much for the help!