In the ImageNet examples, the code looks like:

    for epoch in range(all_epochs):
        train_sampler.set_epoch(epoch)
        train_one_epoch_with_train_loader()
How does sampler.epoch inside the train_loader get updated by the outer call train_sampler.set_epoch(epoch)? It seems that the train_loader is not explicitly re-constructed in every epoch.
I didn't find many hints in /torch/utils/data.
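To make the question concrete, here is a minimal toy sketch (plain Python, not torch internals) of the behavior I would expect from reference semantics: the loader stores the sampler object itself, so mutating the sampler's epoch is visible through the loader. The Sampler/Loader classes below are hypothetical stand-ins, not the real torch classes.

```python
class Sampler:
    """Toy stand-in for DistributedSampler: epoch influences iteration order."""
    def __init__(self):
        self.epoch = 0

    def set_epoch(self, epoch):
        self.epoch = epoch

    def __iter__(self):
        # In the real DistributedSampler, self.epoch seeds the shuffle;
        # here it just shifts the indices so the effect is observable.
        return iter(range(self.epoch, self.epoch + 3))


class Loader:
    """Toy stand-in for DataLoader: keeps a reference to the sampler object."""
    def __init__(self, sampler):
        self.sampler = sampler  # same object, not a copy

    def __iter__(self):
        # Each epoch, iteration re-reads the (possibly mutated) sampler.
        return iter(self.sampler)


sampler = Sampler()
loader = Loader(sampler)
for epoch in range(2):
    sampler.set_epoch(epoch)   # mutation is visible through loader.sampler
    print(list(loader))        # → [0, 1, 2] then [1, 2, 3]
```

If DataLoader works the same way (holding the sampler by reference and consulting it anew on each `iter(loader)` call), that would explain why the loader never needs to be re-constructed. Is that the actual mechanism?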
Thanks!