IterDataPipe and DDP

According to the documentation, torch.utils.data.distributed.DistributedSampler is not compatible with iterable-style datasets, since samplers work by yielding indices into a map-style dataset. I was wondering whether there is any way to get the equivalent behavior with an IterDataPipe under DDP, i.e. to have each rank consume a disjoint shard of the stream.
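The closest workaround I could come up with is to shard manually inside `__iter__`, using the process rank together with the DataLoader worker info. Here is a minimal sketch of that idea (the class name and the stride-based split are my own, not an official API):

```python
import torch.distributed as dist
from torch.utils.data import IterableDataset, get_worker_info


class ShardedIterableDataset(IterableDataset):
    """Shards an iterable source across DDP ranks and DataLoader workers."""

    def __init__(self, source):
        self.source = source  # any re-iterable source of samples

    def __iter__(self):
        # One shard per (rank, worker) pair.
        if dist.is_available() and dist.is_initialized():
            rank, world_size = dist.get_rank(), dist.get_world_size()
        else:
            rank, world_size = 0, 1

        worker = get_worker_info()
        worker_id = worker.id if worker is not None else 0
        num_workers = worker.num_workers if worker is not None else 1

        shard_id = rank * num_workers + worker_id
        num_shards = world_size * num_workers

        # Stride through the stream so shards are disjoint
        # and together cover the whole dataset.
        for i, item in enumerate(self.source):
            if i % num_shards == shard_id:
                yield item
```

Each (rank, worker) pair then strides through the stream with step `world_size * num_workers`, so no sampler is needed at all, but I don't know whether this is the intended approach or whether it interacts badly with shuffling and epoch boundaries.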
Some people have suggested solutions for the related problem of splitting an IterDataPipe across multiple DataLoader workers, typically by inserting sharding_filter into the pipe (as in the sketch below), but it is not clear to me whether that also covers the DDP case.
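For reference, this is the worker-level pattern I have seen suggested. As far as I can tell, with a plain DataLoader this only shards across the worker processes, not across DDP ranks, which is exactly what my question is about:

```python
from torch.utils.data import DataLoader
from torch.utils.data.datapipes.iter import IterableWrapper

# Shuffle before sharding_filter so each shard receives a random,
# disjoint subset of the buffered stream.
pipe = IterableWrapper(range(1000)).shuffle().sharding_filter()

# DataLoader applies sharding at the sharding_filter point,
# splitting the pipe across its num_workers processes.
loader = DataLoader(pipe, batch_size=16, num_workers=4)

for batch in loader:
    ...  # each worker yields a disjoint slice of the data
```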