Pytorch-lightning: pre-fetch 1 batch disable


I am trying to use the current batch's output as the input for the next batch, but the dataloader always loads one batch ahead, before the current training step has produced its output.

“When iterable datasets are used, Lightning will pre-fetch 1 batch (in addition to the current batch) so it can detect when the training will stop and run validation if necessary.”

Is there any way to disable this pre-fetching? Alternatively, is there a way to save the current output first and read it back as the input for the next batch?


The PyTorch DataLoader lets you control this via prefetch_factor, but note it only applies when num_workers > 0 and cannot be set to 0; with num_workers=0, batches are loaded synchronously in the main process, so nothing is fetched ahead at the DataLoader level. I don't know how Lightning handles its own one-batch look-ahead on top of that.
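A minimal sketch of the two DataLoader configurations mentioned above, assuming plain PyTorch (this controls DataLoader-level prefetching only; Lightning's own one-batch look-ahead for iterable datasets is separate and may still apply):

```python
import torch
from torch.utils.data import DataLoader, Dataset

class ToyDataset(Dataset):
    """Tiny map-style dataset just for illustration."""
    def __len__(self):
        return 8
    def __getitem__(self, idx):
        return torch.tensor([float(idx)])

# num_workers=0: batches are loaded synchronously in the main process,
# so the DataLoader does not fetch ahead of the training loop.
sync_loader = DataLoader(ToyDataset(), batch_size=2, num_workers=0)

# With workers, each worker loads prefetch_factor batches in advance
# (default 2). It must be >= 1, so num_workers=0 is the only way to
# avoid any look-ahead at the DataLoader level.
prefetching_loader = DataLoader(
    ToyDataset(), batch_size=2, num_workers=2, prefetch_factor=1
)

for batch in sync_loader:
    print(batch.shape)
```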