Training on multiple dataloaders

Hello everyone!

I had a question regarding training on more than one dataloader.

So I have multiple dataloaders, each containing a different dataset, and I gathered them into a list.

The training loop is a while loop with a nested for loop iterating over a dataloader. I'm using cycle from itertools, so when the nested for loop finishes, it cycles to the next dataloader in the list of dataloaders.

An example is the following, where train_loader is a list containing multiple dataloaders.

This seems to work, but I wanted to check whether there are any drawbacks to doing so.

from itertools import cycle

training_dataloaders = cycle(train_loader)

while Train:
    train_loader = next(training_dataloaders)

    for batch in train_loader:
        ...  # training step on this batch
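To make the pattern concrete, here is a minimal, self-contained sketch. It uses plain Python lists as stand-ins for actual DataLoader objects (any iterable of batches works the same way), and a step counter in place of the open-ended `while Train` condition; the loader contents, `max_steps`, and variable names are illustrative assumptions, not part of the original code:

```python
from itertools import cycle

# Hypothetical stand-ins for DataLoader objects: any iterable of batches works.
loader_a = [("a", 0), ("a", 1)]
loader_b = [("b", 0), ("b", 1), ("b", 2)]
train_loaders = [loader_a, loader_b]

# cycle() yields loader_a, loader_b, loader_a, loader_b, ... indefinitely.
training_dataloaders = cycle(train_loaders)

max_steps = 10  # concrete stop condition instead of an open-ended `while Train`
seen = []
step = 0
while step < max_steps:
    loader = next(training_dataloaders)  # advance to the next dataloader
    for batch in loader:                 # exhaust this dataloader's batches
        if step >= max_steps:
            break
        seen.append(batch)               # stand-in for one training step
        step += 1

# Batches arrive in order: all of loader_a, then all of loader_b, repeating.
```

One design note: because `cycle` holds a reference to the list itself, each `next` call simply returns the next loader in order and wraps around, so each dataloader is fully exhausted before the loop moves on to the next one.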

Thank you!

I don’t see a problem with this approach.