torch.utils.data.DataLoader + threading gets stuck sometimes

Hi,

When I create two threads with one DataLoader per thread, I intermittently get the following warning:

OMP: Warning #190: Forking a process while a parallel region is active is potentially unsafe.

Moreover, with two threads loading data at the same time, the program sometimes hangs during training. torch.multiprocessing works perfectly with one DataLoader per process, but I have to use threading for other reasons.
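For reference, here is a minimal sketch of the pattern I am using (simplified so it runs standalone; `DummyLoader` is a hypothetical stand-in for the real `torch.utils.data.DataLoader` with `num_workers > 0`, which is where the worker fork happens in my actual code):

```python
import threading

class DummyLoader:
    """Stand-in for a DataLoader that yields a few batches."""
    def __init__(self, n_batches=3):
        self.n_batches = n_batches

    def __iter__(self):
        # A real DataLoader would yield batches of tensors here,
        # possibly from forked worker processes.
        for i in range(self.n_batches):
            yield i

results = []
lock = threading.Lock()

def train_loop(loader):
    # Each thread iterates over its own loader, as in my real code.
    for batch in loader:
        with lock:
            results.append(batch)

# Two threads, one loader each.
threads = [threading.Thread(target=train_loop, args=(DummyLoader(),))
           for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(len(results))  # 6: three batches from each of the two loaders
```

In the real code, each `DummyLoader()` is a `DataLoader` instance with worker processes, and the hang appears only occasionally while both threads are iterating.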

I am using PyTorch 1.4.0 and Python 3.6. How can I make DataLoader + threading work reliably, without the occasional hang?

Thanks