Will DataLoader always pick the same samples if a seed is used?

My code follows this structure:


def train_epoch(epoch):
    # assume it loads 1000 samples
    # with a batch size of 10 it will have 100 batches
    train_loader_long = DataLoader(...)

    # assume it loads 100 samples
    # with a batch size of 10 it will have 10 batches
    train_loader_short = DataLoader(...something else..)

    # this should be 10
    min_len = min(len(train_loader_short), len(train_loader_long))

    # create each iterator once, outside the loop; calling
    # next(iter(loader)) inside the loop would build a fresh iterator
    # every step and keep returning only the loader's first batch
    long_iter = iter(train_loader_long)
    short_iter = iter(train_loader_short)

    for i in range(min_len):
        batch_1 = next(long_iter)
        batch_2 = next(short_iter)
        ...
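
(As an aside, a more idiomatic way to pair the two loaders, assuming both are ordinary DataLoader instances, is zip, which stops as soon as the shorter loader is exhausted:

for batch_1, batch_2 in zip(train_loader_long, train_loader_short):
    ...
)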

After 10 batches I will have seen all samples of loader_short, but only 10% of loader_long. My question is:

When I call the train_epoch() function for 100 epochs, will it pick the SAME samples from loader_long 100 times, or will it randomly pick different samples each time? In the latter case, I would (in probability) see all samples given sufficiently many epochs.

If you do not set the seed again inside the train_epoch method and use shuffle=True, random samples will be returned: the DataLoader reshuffles the dataset every time a new iterator is created, so over sufficiently many epochs you will (in probability) see every sample of loader_long.
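
For example, here is a minimal sketch (the toy index dataset and the names are illustrative, not from your code) showing that the first batch changes every epoch when shuffle=True and the seed is set only once at startup:

import torch
from torch.utils.data import DataLoader, TensorDataset

torch.manual_seed(0)  # seed set once, at startup

# toy dataset: 1000 samples, each sample is just its own index
dataset = TensorDataset(torch.arange(1000))
loader = DataLoader(dataset, batch_size=10, shuffle=True)

for epoch in range(3):
    # each call to iter(loader) draws a fresh permutation from the
    # global RNG, so the printed batches differ from epoch to epoch
    first_batch = next(iter(loader))[0]
    print(f"epoch {epoch}: first batch = {first_batch.tolist()}")

# Calling torch.manual_seed(0) again before each epoch would instead
# replay the same permutation, i.e. the same batches every epoch.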