I am trying to create a training loop in a reinforcement-learning-type setting. I have a dataset object that grows continuously as data is added to it during execution. I simultaneously want to train a model by sampling batches from this growing dataset with a DataLoader; however, the DataLoader does not seem to register the new data being added to the dataset. Is there a way to achieve this using the Dataset and DataLoader classes, and if not, how can I achieve it efficiently?
I was just wondering: can you wrap the training in a loop and create a new DataLoader each time to capture the expanding data?
Hi, thanks, I did consider this, but while searching for an answer I came across the comment "DataLoader iterators are not meant to be very short lived objects" (Get a single batch from DataLoader without iterating · Issue #1917 · pytorch/pytorch · GitHub), which made me think this would not be a good idea, though I'm not absolutely sure.
What I meant is that you can keep two loops: an outer loop that creates the DataLoader, and an inner child loop that iterates over the data. Once an epoch is done, del the DataLoader and recreate it for the next epoch, so the fresh one picks up the new data.
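Something like this sketch (the `GrowingDataset` here is just a stand-in for your own dataset, and the inner loop is where your training step would go):

```python
import torch
from torch.utils.data import Dataset, DataLoader

class GrowingDataset(Dataset):
    """A map-style dataset wrapping a plain list that grows at runtime."""
    def __init__(self):
        self.samples = []

    def add(self, x):
        self.samples.append(x)

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        return self.samples[idx]

ds = GrowingDataset()
for i in range(4):
    ds.add(float(i))

sizes = []
for epoch in range(3):                  # outer loop: one fresh DataLoader per epoch
    loader = DataLoader(ds, batch_size=2, shuffle=True)
    n = 0
    for batch in loader:                # inner loop: a normal training pass
        n += batch.numel()              # your train step on `batch` would go here
    sizes.append(n)
    ds.add(float(len(ds)))              # e.g. the environment adds new experience
    del loader                          # rebuilt next epoch, so new data is seen
print(sizes)                            # each epoch iterates over the grown dataset
```

On the "short lived objects" concern: with the default `num_workers=0`, constructing a DataLoader is lightweight (no worker processes are spawned), so recreating one per epoch costs very little. The startup overhead that comment refers to mainly matters when `num_workers > 0`, where each new loader forks worker processes.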