DataLoader and fixed batch size when generating multiple datapoints per input

I have a large dataset of sentences; for each sentence I sample an arbitrary number of labels, so each sentence yields a variable-size set of (sentence, lbl) pairs. I would like to use a DataLoader to efficiently fetch batches of exactly N (sentence, lbl) pairs, where N is the batch size. Without a DataLoader, I would simply read sentences and generate pairs until my batch is full:

import queue

q = queue.Queue()
while training:
    s = get_next_sentence()
    add_pairs_to_queue(q, s)        # enqueue every (sentence, lbl) pair for s
    if q.qsize() >= batch_size:     # Queue exposes qsize(), not size()
        yield [q.get() for _ in range(batch_size)]

I cannot pre-generate all the pairs. Any idea how I can get similar behavior with a DataLoader? I can implement a Dataset class that generates all the pairs for one sentence and then pass a custom collate_fn to the DataLoader, but I will end up with batches of arbitrary length.
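For reference, here is a minimal sketch of the collate_fn approach just described, illustrating why the resulting batches have arbitrary lengths. The sentence data and the label-sampling rule are made up for illustration; `SentencePairDataset` and `flatten_collate` are hypothetical names:

```python
import random
from torch.utils.data import DataLoader, Dataset

class SentencePairDataset(Dataset):
    """Each item is the full list of (sentence, lbl) pairs for one sentence."""
    def __init__(self, sentences):
        self.sentences = sentences

    def __len__(self):
        return len(self.sentences)

    def __getitem__(self, idx):
        s = self.sentences[idx]
        # Hypothetical sampling: an arbitrary number of labels per sentence.
        n_labels = random.randint(1, 5)
        return [(s, lbl) for lbl in range(n_labels)]

def flatten_collate(batch):
    # Concatenate the per-sentence pair lists into one flat list of pairs.
    return [pair for pairs in batch for pair in pairs]

loader = DataLoader(SentencePairDataset([f"sent {i}" for i in range(8)]),
                    batch_size=2, collate_fn=flatten_collate)
for batch in loader:
    print(len(batch))  # anywhere from 2 to 10 pairs: batch size is not fixed
```

Here `batch_size=2` counts sentences, not pairs, so each emitted batch holds between 2 and 10 pairs depending on how many labels were sampled, which is exactly the arbitrary-length problem.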