Is there a way to train independent models in parallel using the same dataloader?

You could use a Queue as described here, or a simple shared-array implementation as shown in this example. With a Queue, a single producer process runs the dataloader and pushes each batch to every model's worker; with a shared array, the batches live in shared memory and each worker reads them directly.
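Here is a minimal sketch of the Queue approach using the standard `multiprocessing` module (`torch.multiprocessing` exposes the same API). Since each model must see *every* batch, the producer broadcasts each batch into one queue per model. The `train_worker` function and its summing "training step" are placeholders for a real model update:

```python
import multiprocessing as mp

def train_worker(model_id, batch_queue, results):
    """Consume batches from this model's queue; the sum stands in for a training step."""
    total = 0.0
    while True:
        batch = batch_queue.get()
        if batch is None:  # sentinel: no more batches
            break
        total += sum(batch)  # placeholder for model.forward/backward/step
    results[model_id] = total

def main():
    num_models = 3
    queues = [mp.Queue() for _ in range(num_models)]  # one queue per model
    manager = mp.Manager()
    results = manager.dict()

    workers = [
        mp.Process(target=train_worker, args=(i, queues[i], results))
        for i in range(num_models)
    ]
    for w in workers:
        w.start()

    # Single "dataloader" loop: broadcast each batch to every model's queue.
    dataset = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
    for batch in dataset:
        for q in queues:
            q.put(batch)
    for q in queues:
        q.put(None)  # tell each worker the epoch is over

    for w in workers:
        w.join()
    return dict(results)

if __name__ == "__main__":
    print(main())
```

Each worker ends up having processed the full dataset independently, so all three totals are equal. Note that with real tensors you would want `torch.multiprocessing` so that tensor storage is moved to shared memory rather than pickled through the queue.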
