Increasing num_workers stops my model from learning

Hello. I generally run my experiments with num_workers set to 1 in the DataLoader. Recently I got a slightly more demanding dataset to generate, so I thought I'd speed things up by setting num_workers to 8. Strangely, this completely stopped my model from learning. Why might that be? My datasets are generated on the fly, if that's relevant.
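For context, here is roughly what my setup looks like. This is a simplified sketch only; the class name, sizes, and the random generation here are made up to illustrate the "generated on the fly" part, and the real generation logic and model are more involved:

import torch
from torch.utils.data import Dataset, DataLoader

class OnTheFlyDataset(Dataset):
    def __len__(self):
        return 10000

    def __getitem__(self, idx):
        # Samples are generated on the fly rather than read from disk.
        x = torch.randn(16)
        y = (x.sum() > 0).long()
        return x, y

# With num_workers=1 the model trains fine; with num_workers=8 the loss stays flat.
loader = DataLoader(OnTheFlyDataset(), batch_size=32, num_workers=8)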

And, more generally, how does num_workers work exactly? Could you point me to some docs? Does each worker work on a different batch, which is then collated?
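For what it's worth, here is a small probe I was thinking of using to see which worker produces each sample, assuming torch.utils.data.get_worker_info() is the right way to check that:

from torch.utils.data import Dataset, DataLoader, get_worker_info

class ProbeDataset(Dataset):
    def __len__(self):
        return 8

    def __getitem__(self, idx):
        info = get_worker_info()  # None in the main process, else has .id, .seed, ...
        worker_id = info.id if info is not None else -1
        return idx, worker_id

if __name__ == '__main__':
    # Print which worker produced the samples in each batch.
    for indices, worker_ids in DataLoader(ProbeDataset(), batch_size=4, num_workers=2):
        print(indices.tolist(), worker_ids.tolist())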

And, maybe related: since I write code on Windows, all my code is wrapped in

from multiprocessing import freeze_support

if __name__ == '__main__':
    freeze_support()
    .... 

would this affect performance when run on a non-Windows machine?
Thanks

Are you running this on Windows or Linux? And what do you mean by "stopped learning"?

Linux. The loss stays flat.