Setting seeds does not give reproducible code

@ptrblck Actually, I am getting non-deterministic values for multiple outputs. While searching for solutions, I found a post here suggesting that setting num_workers = 0 for the DataLoader fixes this, but that did not work for me. After experimenting with a few changes, I now suspect the problem lies in the DataLoader part of my code. Here is the snippet:

from torch.utils import data

# InfiniteSamplerWrapper is defined in a separate file (shown below).
content_iter = iter(data.DataLoader(
    content_dataset, batch_size=4,
    sampler=InfiniteSamplerWrapper(content_dataset),
    num_workers=16))
style_iter = iter(data.DataLoader(
    style_dataset, batch_size=number_of_styles,
    sampler=InfiniteSamplerWrapper(style_dataset),
    num_workers=16))

Here, number_of_styles is 19.
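For context, the seeding at the top of my script follows the standard recipe, roughly like the sketch below (set_seed is just an illustrative name; the exact seed value does not matter):

import random
import numpy as np
import torch

def set_seed(seed=0):
    # Seed Python, NumPy, and PyTorch (CPU and all GPUs), and force
    # cuDNN into deterministic mode.
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False

set_seed(0)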
The InfiniteSamplerWrapper class is defined in a different file as follows:

import numpy as np
from torch.utils import data


def InfiniteSampler(n):
    # Yields indices from repeated random permutations of range(n), forever.
    # Starting at i = n - 1 means only the last index of the first
    # (unseeded) permutation is yielded before the reseed below kicks in.
    # i = 0
    i = n - 1
    order = np.random.permutation(n)  # drawn from the global NumPy RNG state
    while True:
        yield order[i]
        i += 1
        if i >= n:
            np.random.seed(1)  # every subsequent permutation uses seed 1
            order = np.random.permutation(n)
            i = 0


class InfiniteSamplerWrapper(data.sampler.Sampler):
    def __init__(self, data_source):
        self.num_samples = len(data_source)

    def __iter__(self):
        return iter(InfiniteSampler(self.num_samples))

    def __len__(self):
        # Effectively infinite; the DataLoader never exhausts this sampler.
        return 2 ** 31
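To narrow this down, the mismatch can be reproduced with the sampler alone, without any model or DataLoader involved (a minimal check reusing the class above; the size 19 just mirrors number_of_styles):

# Two fresh iterators over the same sampler: the very first index comes
# from the global NumPy state, everything after it from np.random.seed(1).
sampler = InfiniteSamplerWrapper(range(19))

it1 = iter(sampler)
run1 = [next(it1) for _ in range(20)]

it2 = iter(sampler)
run2 = [next(it2) for _ in range(20)]

print(run1[0] == run2[0])    # usually False: depends on prior global RNG state
print(run1[1:] == run2[1:])  # True: all later permutations use seed 1

So the unseeded first permutation looks like one source of nondeterminism here.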

If I leave np.random.seed() as it is, without specifying any number inside the parentheses, I get completely different values right from the beginning. Whereas if I use np.random.seed(1), I get exactly the same values for the first iteration, but they start to diverge from the second iteration onward.
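Separately, since I use num_workers=16, my understanding is that each worker process needs its own NumPy/random seed as well. The recipe from the PyTorch reproducibility notes looks roughly like this (seed_worker is just an illustrative name), in case part of the nondeterminism comes from the workers:

import random
import numpy as np
import torch
from torch.utils import data

def seed_worker(worker_id):
    # Derive a per-worker seed from the worker's base torch seed so that
    # NumPy and Python's random module are deterministic in every worker.
    worker_seed = torch.initial_seed() % 2**32
    np.random.seed(worker_seed)
    random.seed(worker_seed)

g = torch.Generator()
g.manual_seed(0)

content_iter = iter(data.DataLoader(
    content_dataset, batch_size=4,
    sampler=InfiniteSamplerWrapper(content_dataset),
    num_workers=16, worker_init_fn=seed_worker, generator=g))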