Hello, I’m quite new to PyTorch and I have a question.
In DataLoader, there’s a batch_size argument that determines the size of each batch, and the number of batches then follows from the total number of samples in the dataset. However, I’d like to manually control the number of batches run per epoch. Is there a way I could do this while still using DataLoader?
A simple approach would be to use the batch index in the training loop and call break once you’ve sampled the desired number of batches.
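A minimal sketch of this approach, using a hypothetical toy dataset (the dataset, batch size, and `max_batches` value are just placeholders for illustration):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy dataset: 100 samples with 10 features each, binary labels.
dataset = TensorDataset(torch.randn(100, 10), torch.randint(0, 2, (100,)))
loader = DataLoader(dataset, batch_size=8, shuffle=True)

max_batches = 5  # run only this many batches per epoch
batches_run = 0

for batch_idx, (x, y) in enumerate(loader):
    # ... forward pass, loss, backward, optimizer.step() would go here ...
    batches_run += 1
    if batch_idx + 1 == max_batches:
        break  # stop this epoch early after max_batches batches
```

Since the loop breaks out before the DataLoader is exhausted, each epoch sees only `max_batches` batches, and with `shuffle=True` you’ll get a different random subset of batches every epoch.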
Alternatively, you could split the Dataset via random_split with the desired lengths, or use a custom sampler, but I think the first approach is the easiest if you just want to change the number of batches.
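For the random_split alternative, a sketch could look like this (again with a made-up toy dataset; the key idea is picking the subset length as `batch_size * num_batches`):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset, random_split

dataset = TensorDataset(torch.randn(100, 10), torch.randint(0, 2, (100,)))

batch_size = 8
num_batches = 5
subset_len = batch_size * num_batches  # 40 samples -> exactly 5 full batches

# Split off a random subset of the desired size; discard the rest.
subset, _ = random_split(dataset, [subset_len, len(dataset) - subset_len])
loader = DataLoader(subset, batch_size=batch_size, shuffle=True)
```

Note that the subset is fixed once at split time, so every epoch iterates over the same samples. With the break approach above, shuffling gives you different samples each epoch.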