Suppose a dataset is given and a RandomSampler is created by
Afterwards a DataLoader is initialised via
auto loader = torch::data::make_data_loader(
    std::move(dataset), sampler,
    torch::data::DataLoaderOptions().batch_size(batch_size));
After a couple of epochs I would like to checkpoint the state of the DataLoader’s internal sampler, but there seems to be no way of saving it.
I looked at “StatelessDataLoader” and “DataLoaderBase”, but since “sampler_” is a private member, there is no access to its save or load functionality.
Might there be a way to “fast forward” the DataLoader instead? Calling “next()” repeatedly would work, but it actually loads samples, which is slow.
Thanks in advance!