num_workers and batch_size

It looks like when num_workers > batch_size, the same data is fed to the neural net. I was using num_workers=8 and batch_size=1, and the same example was fed as input multiple times. I think this might be a bug.

Could you post a reproducible code snippet showing this behavior? This simple example works without repeating any values:

import torch
from torch.utils.data import DataLoader, TensorDataset

dataset = TensorDataset(torch.arange(100))
loader = DataLoader(dataset, batch_size=1, num_workers=8)

for data in loader:
    print(data)
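
One possible cause, assuming your Dataset draws random values (I haven't seen your code, so this is a guess): when workers are forked, they can inherit the same NumPy global RNG state, so each worker produces identical "random" samples, which looks like repeated data. A minimal sketch of the usual fix, seeding NumPy per worker via worker_init_fn (RandomDataset and seed_worker are hypothetical names for illustration):

import numpy as np
import torch
from torch.utils.data import DataLoader, Dataset

class RandomDataset(Dataset):
    # Hypothetical dataset that samples from NumPy's global RNG.
    def __len__(self):
        return 8

    def __getitem__(self, idx):
        # Without per-worker seeding, forked workers may all
        # return the same value here.
        return np.random.randint(0, 1000)

def seed_worker(worker_id):
    # torch.initial_seed() differs per worker, so this gives
    # each worker its own NumPy seed.
    np.random.seed(torch.initial_seed() % 2**32)

loader = DataLoader(RandomDataset(), batch_size=1, num_workers=8,
                    worker_init_fn=seed_worker)

for data in loader:
    print(data)

If your dataset doesn't use any randomness, this wouldn't apply, which is why a reproducible snippet would help narrow it down.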