How does the distributed sampler pass the value "epoch" to the data loader?

In the ImageNet example, the code looks like:

for epoch in range(all_epochs):
    train_sampler.set_epoch(epoch)
    train_one_epoch_with_train_loader()

How is sampler.epoch inside the train_loader set by the external call train_sampler.set_epoch(epoch)? The train_loader does not seem to be explicitly re-constructed every epoch.
I didn't find many hints in /torch/utils/data.
Thanks!

The sampler is passed as an argument when initializing the DataLoader, so the train loader holds a reference to the same sampler object. Calling train_sampler.set_epoch(epoch) therefore mutates the very object the loader will use the next time it is iterated, and neither the loader nor the sampler needs to be re-constructed every epoch.
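
Here is a minimal sketch to illustrate the shared-reference behavior. The toy dataset, batch size, and the hard-coded num_replicas/rank arguments are placeholders so the snippet runs in a single process, not part of the ImageNet example; in real DDP training the sampler picks those values up from the initialized process group.

import torch
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler

# Toy dataset standing in for ImageNet.
dataset = TensorDataset(torch.randn(100, 3), torch.randint(0, 10, (100,)))

# num_replicas/rank are fixed here so the snippet runs without init_process_group.
train_sampler = DistributedSampler(dataset, num_replicas=1, rank=0, shuffle=True)
train_loader = DataLoader(dataset, batch_size=8, sampler=train_sampler)

# The loader stores a reference to the sampler, not a copy:
assert train_loader.sampler is train_sampler

for epoch in range(3):
    # Mutates the shared sampler; the next time the loader iterates, it calls
    # sampler.__iter__(), which uses self.epoch to seed the shuffle.
    train_sampler.set_epoch(epoch)
    for images, labels in train_loader:
        pass  # training step goes here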