Libtorch: resetting the DataLoader results in a segfault (trying to get a fresh iterator)

In libtorch, I want to get a new iterator that starts from the beginning of the DataLoader before the previous one has been fully consumed. I do this because I want to evaluate on the same data over and over again, so I don't have to worry about sample variance in the test data.

If I simply call `current_iterator = data_loader_->begin();` I get this error:

Attempted to get a new DataLoader iterator while another iterator is not yet exhausted.

pytorch/torch/csrc/api/include/torch/data/dataloader/base.h: 60
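For context, here is a minimal sketch of what I am doing. The `DummyDataset`, batch size, and worker count below are just illustrative stand-ins for my real setup, but the behaviour is the same:

```cpp
#include <torch/torch.h>

// Illustrative stand-in for my real dataset, just to keep the repro
// self-contained: 16 random feature vectors with dummy labels.
struct DummyDataset : torch::data::datasets::Dataset<DummyDataset> {
  torch::data::Example<> get(size_t index) override {
    return {torch::randn({3}), torch::tensor(static_cast<int64_t>(index))};
  }
  torch::optional<size_t> size() const override {
    return 16;
  }
};

int main() {
  // A member variable in my real code; a local here for brevity.
  // Batch size and worker count are illustrative.
  auto data_loader_ = torch::data::make_data_loader(
      DummyDataset().map(torch::data::transforms::Stack<>()),
      torch::data::DataLoaderOptions().batch_size(4).workers(2));

  // Consume part of the data, but stop before the iterator is exhausted.
  auto current_iterator = data_loader_->begin();
  ++current_iterator;

  // Asking for another iterator at this point throws the error above.
  current_iterator = data_loader_->begin();
}
```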

If I call `data_loader_->reset()` before this reassignment, I get a segfault.
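The variant with `reset()` that crashes for me is essentially this (continuing the sketch above; it compiles fine with the libtorch version I am on):

```cpp
  // Resetting the loader first and then asking for a new iterator:
  // this is the combination that dies with a segfault instead of throwing.
  data_loader_->reset();
  current_iterator = data_loader_->begin();
```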

What do I need to do to get a fresh iterator?

Right now the workaround I use is to reconstruct the data loader every time I need a fresh iterator. Given that the DataLoader class has a reset method, and that getting a fresh iterator probably shouldn't involve this much work, this is still an open issue for me.
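Concretely, the workaround looks like this (again with the illustrative dataset and options from the sketch above):

```cpp
  // Throw away the old loader and build a new one; begin() on the fresh
  // loader then hands back an iterator that starts from the first batch.
  data_loader_ = torch::data::make_data_loader(
      DummyDataset().map(torch::data::transforms::Stack<>()),
      torch::data::DataLoaderOptions().batch_size(4).workers(2));
  current_iterator = data_loader_->begin();
```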