Can the PyTorch DataLoader prefetch training samples?

Hi,

I am trying to understand how the DataLoader works, and I have a question: can PyTorch prefetch the training samples during training, so that the model does not need to wait for the data before each training step?

If PyTorch has this feature, how can I switch it on or off?

If you are using multiple workers (num_workers>=1), the next batches will be prefetched once the code enters the DataLoader loop. With num_workers=0 the data is loaded synchronously in the main process, so no prefetching happens.
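For reference, here is a minimal sketch of how this could look; the dataset, batch size, and worker counts below are just placeholder values:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Placeholder dataset; replace with your own Dataset
dataset = TensorDataset(
    torch.randn(1000, 3, 32, 32),
    torch.randint(0, 10, (1000,)),
)

# num_workers > 0 enables background worker processes that load batches
# ahead of time. prefetch_factor controls how many batches each worker
# prepares in advance (only valid when num_workers > 0).
loader = DataLoader(
    dataset,
    batch_size=64,
    shuffle=True,
    num_workers=4,
    prefetch_factor=2,
    pin_memory=True,  # can speed up host-to-GPU transfer
)

for images, labels in loader:
    # While this batch is being processed, the workers are already
    # preparing the next batches in the background.
    pass
```

Setting num_workers=0 (the default) would disable this behavior, since all loading then happens in the main process.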