Concat dataloaders for a unified interleaving dataloader?

I'm trying to train with two dataloaders: one attached to a dataset where each `__getitem__` call fetches a predefined batch of varying length (so the `batch_size` I pass to the `DataLoader` is 1), and one where I sample randomly from a set of sequences, so each `__getitem__` call fetches a single sample.

I’m looking for something like

    loader = DataLoader(
        batched_dataset,
        batch_size=1,
    )
    loader_tdm = DataLoader(
        random_samples_dataset,
        batch_size=8,
    )
    from data.concat_dataloaders import AlternateDataloader
    loader = AlternateDataloader(loader, loader_tdm)

Is this possible?

You could create the iterators directly via:

iter0 = iter(loader)
iter1 = iter(loader_tdm)

and either use some utility functions from itertools or manually alternate between these iterators.
Note that you would have to manually take care of recreating the iterators once they run out of samples.
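To make that concrete, here is a minimal sketch of such an alternating wrapper (the class name `AlternatingLoader` is made up, not a PyTorch API). It drives the iteration with the first loader and recreates the second loader's iterator whenever it runs dry, so a shuffling `DataLoader` would reshuffle on each restart instead of repeating a cached pass:

```python
class AlternatingLoader:
    """Yield one batch from each of two iterables in turn.

    Iteration stops after one full pass over loader_a; loader_b's
    iterator is recreated whenever it is exhausted. Hypothetical
    helper, not part of torch.utils.data.
    """

    def __init__(self, loader_a, loader_b):
        self.loader_a = loader_a
        self.loader_b = loader_b

    def __iter__(self):
        iter_b = iter(self.loader_b)
        for batch_a in self.loader_a:
            yield batch_a
            try:
                yield next(iter_b)
            except StopIteration:
                # loader_b ran out: recreate its iterator and continue
                iter_b = iter(self.loader_b)
                yield next(iter_b)


# Plain lists stand in for the two DataLoaders:
out = list(AlternatingLoader(["a1", "a2", "a3"], ["b1", "b2"]))
print(out)  # ['a1', 'b1', 'a2', 'b2', 'a3', 'b1']
```

If you instead want to stop as soon as the shorter loader is exhausted, `zip(loader, loader_tdm)` plus yielding both batches per step is enough and needs no recreation logic.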