DataLoader communication between workers

I am developing a small utility package to load data for training recurrent nets. The idea is to stream temporally coherent batches. I reuse IterableDataset with some tweaks that allow joining several streams in the same batch, and doing so with multiple workers.
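Roughly, the idea looks like this (a simplified sketch rather than the actual package code; `streams` and `batch_size` are placeholder names, and each worker just takes a strided slice of the stream list):

```python
import itertools

import torch
from torch.utils.data import IterableDataset, get_worker_info


class MultiStreamDataset(IterableDataset):
    """Sketch: yield batches in which each row follows its own stream,
    so consecutive batches stay temporally coherent."""

    def __init__(self, streams, batch_size):
        self.streams = streams        # list of per-stream iterables (e.g. lists of tensors)
        self.batch_size = batch_size  # number of streams joined in one batch

    def __iter__(self):
        worker = get_worker_info()
        if worker is None:
            my_streams = self.streams                            # single-process loading
        else:
            # naive static split: each worker takes a strided slice of the stream list
            my_streams = self.streams[worker.id::worker.num_workers]

        # walk `batch_size` streams in lockstep so every batch row stays on its own stream
        for start in range(0, len(my_streams), self.batch_size):
            group = [iter(s) for s in my_streams[start:start + self.batch_size]]
            for rows in itertools.zip_longest(*group):
                rows = [r for r in rows if r is not None]        # shorter streams drop out early
                yield torch.stack([torch.as_tensor(r) for r in rows])
```

Since the dataset already yields full batches, the DataLoader is created with `batch_size=None` to disable automatic batching.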

To split the load of the numerous streams across the workers, I currently distribute them evenly, but this is not ideal since some streams might be longer than others.

Ideally, I would like the workers to communicate, so they can share a single list of streams and be notified of which streams have already been read.

Is there a way to do this within the PyTorch data framework, e.g. with a custom sampler, or do I need to hack it myself using the multiprocessing library?

EDIT: I did a first version with a multiprocessing.Lock. It works as expected.
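
The gist of it is a shared index guarded by the lock, so each worker dynamically claims the next unread stream instead of getting a fixed slice. A minimal sketch of that idea (not the exact code from the package; it assumes the default 'fork' start method on Linux so the DataLoader workers inherit the shared objects):

```python
import multiprocessing as mp

import torch
from torch.utils.data import IterableDataset, DataLoader


class SharedStreamDataset(IterableDataset):
    """Sketch: workers pull the next unread stream from a shared counter
    guarded by a multiprocessing.Lock.
    (Joining several streams per batch is left out to keep the sketch short.)"""

    def __init__(self, streams):
        self.streams = streams
        self._next = mp.Value('i', 0)   # index of the next unread stream, shared across workers
        self._lock = mp.Lock()          # protects the shared index

    def _claim_stream(self):
        with self._lock:
            idx = self._next.value
            if idx >= len(self.streams):
                return None             # every stream has already been handed out
            self._next.value = idx + 1
        return idx

    def __iter__(self):
        while True:
            idx = self._claim_stream()
            if idx is None:
                return
            for item in self.streams[idx]:
                yield torch.as_tensor(item)


# usage sketch: workers claim streams dynamically instead of using a fixed split
# loader = DataLoader(SharedStreamDataset(streams), batch_size=4, num_workers=2)
```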

See here: