pin_memory and num_workers in PyTorch DataLoaders

I cannot find the link anywhere, but I read once that if we set pin_memory=True in the PyTorch data loaders, then the num_workers threads should be set to 1.

Is this true? I am not sure if I am misremembering. If yes, what is the reason behind not being able to use multiple threads when using pin_memory? It seems very counterintuitive that this would play a part, as every thread would be copying data to a separate memory area.


I think you’re misremembering. The two options are independent. Also, num_workers is the number of worker processes, not threads.
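A minimal sketch to show the two options working together (the dataset and sizes here are made up for illustration): each worker process loads and collates batches, and pinning happens separately in the main process.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy dataset: 64 samples of 3 features each, with integer labels.
dataset = TensorDataset(torch.randn(64, 3), torch.arange(64))

# pin_memory=True combined with multiple worker processes is a valid,
# common configuration; neither option constrains the other.
loader = DataLoader(dataset, batch_size=8, num_workers=2, pin_memory=True)

for x, y in loader:
    # On a CUDA machine, pinned batches enable async host-to-device copies:
    #   x = x.to("cuda", non_blocking=True)
    # (Batches are only actually pinned when CUDA is available.)
    print(x.shape, y.shape)
    break
```

Note that pinning only takes effect when a CUDA device is available; without one, pin_memory=True is effectively a no-op.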
