I can't find the link anymore, but I read once that if we set `pin_memory=True` in the PyTorch `DataLoader`, then `num_workers` should be set to 1.
Is this true, or am I misremembering? If it is true, what is the reason we can't use multiple worker processes together with `pin_memory`? It seems very counterintuitive that pinning would play a part here, since each worker would be copying data into a separate memory area anyway.
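For reference, this is the kind of setup I mean (a minimal sketch with a toy `TensorDataset`; the dataset and sizes are just placeholders):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy dataset purely for illustration: 64 samples of 3 features each.
dataset = TensorDataset(torch.randn(64, 3), torch.randint(0, 2, (64,)))

# pin_memory=True asks the loader to copy each batch into page-locked
# (pinned) host memory, which makes host-to-GPU transfers faster.
# num_workers controls how many subprocesses load data in parallel.
loader = DataLoader(dataset, batch_size=16, num_workers=2, pin_memory=True)

for x, y in loader:
    # On a machine with a GPU, a pinned source tensor allows an
    # asynchronous copy that overlaps with compute:
    # x = x.cuda(non_blocking=True)
    pass
```

(If no CUDA device is available, recent PyTorch versions just warn and skip the pinning step, so the loader still works on CPU.)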