In my setting, I use multiprocessing a lot, and it turns out I would like the DataLoader to live in processes other than the ones that actually consume the data.
To do this, I've been passing DataLoader objects around to new processes, but that seems to lead to deadlocks and synchronization bugs that are difficult to debug (I'm on PyTorch 0.4.1).
My question is:
it seems that having the data come through a Queue object would solve the problem. Is there some built-in functionality for sending data through Queues? In other words, it would be an alternative to DataLoaders.
If not, I can easily implement it myself, but I'm wondering whether this is already a feature in PyTorch.
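For reference, here is a minimal sketch of the pattern I have in mind: a dedicated producer process that would own the loading logic and push batches into a bounded queue, with a sentinel to signal the end of the data. This uses the stdlib `multiprocessing` module and fake list-of-ints batches as a stand-in; in an actual PyTorch setup you would use `torch.multiprocessing` and have the worker iterate over a real DataLoader instead.

```python
import multiprocessing as mp

def loader_worker(queue, num_batches):
    # Stand-in for a process that owns a DataLoader and iterates over it;
    # here each "batch" is just a list of ints instead of a tensor.
    for i in range(num_batches):
        batch = [i * 4 + j for j in range(4)]
        queue.put(batch)
    queue.put(None)  # sentinel: no more data

def consume(queue):
    # Runs in the consumer process: pull batches until the sentinel arrives.
    batches = []
    while True:
        batch = queue.get()
        if batch is None:
            break
        batches.append(batch)
    return batches

if __name__ == "__main__":
    q = mp.Queue(maxsize=8)  # bounded queue applies backpressure to the producer
    p = mp.Process(target=loader_worker, args=(q, 3))
    p.start()
    result = consume(q)
    p.join()
    print(result)  # [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11]]
```

The bounded `maxsize` matters: it keeps the producer from running arbitrarily far ahead of the consumer, which is the main thing a DataLoader's internal worker queues also do.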