In libtorch, I want to get a new iterator from the beginning of the DataLoader before the previous one has been exhausted. I do this because I want to test on the same data over and over again, so I don't have to worry about sample variance in the test data.
If I simply call current_iterator = data_loader_->begin(); a second time, I get this error:
Attempted to get a new DataLoader iterator while another iterator is not yet exhausted.
pytorch/torch/csrc/api/include/torch/data/dataloader/base.h:60
If I call data_loader_->reset() before this reassignment, I get a segfault instead.
What do I have to do to get a fresh iterator?