Hello, I’m quite new to PyTorch and I have a question.
DataLoader has a batch_size argument that determines the size of each batch, and the number of batches then follows from the total amount of data in the dataset. However, I’d like to manually control the number of batches run per epoch. Is there a way I could do this while still using DataLoader?
A simple approach would be to use the index of the training loop and call
break once you have sampled the desired number of batches.
Alternatively, you could split the dataset via
random_split with the desired lengths, or use a custom sampler, but I think the first approach would be the easiest if you just want to change the number of batches.
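For reference, here is a minimal sketch of both alternatives, assuming a toy TensorDataset of 100 samples; batch_size and n_batch are placeholder values for illustration:

```python
import torch
from torch.utils.data import (DataLoader, TensorDataset,
                              SubsetRandomSampler, random_split)

# Toy dataset: 100 samples of 3 features each (placeholder data).
dataset = TensorDataset(torch.randn(100, 3))

batch_size = 10
n_batch = 4  # desired number of batches per epoch

# Option 1: random_split keeps only enough samples for n_batch batches.
n_keep = n_batch * batch_size
subset, _ = random_split(dataset, [n_keep, len(dataset) - n_keep])
loader_split = DataLoader(subset, batch_size=batch_size, shuffle=True)
print(len(loader_split))  # 4 batches per epoch

# Option 2: a sampler draws a random subset of indices instead.
indices = torch.randperm(len(dataset))[:n_keep]
sampler = SubsetRandomSampler(indices)
loader_sampler = DataLoader(dataset, batch_size=batch_size, sampler=sampler)
print(len(loader_sampler))  # also 4 batches per epoch
```

Note that Option 1 fixes the subset once, while Option 2 re-draws which samples appear each time the sampler is constructed.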
So, like this?

```python
for i, x in enumerate(train_dataloader):
    # load x and process
    if i == n_batch:
        break
```
It fixed my problem. Thank you very much for the help