How to use my own sampler when I already use DistributedSampler?

I just found DistributedSamplerWrapper (linked here). It lets you wrap DistributedSampler on top of an existing sampler, so the base sampler's indices get partitioned across processes. This might be a good feature to add to PyTorch!
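For reference, here is a minimal sketch of how such a wrapper can work. This is an illustration of the general pattern, not the exact linked implementation: the trick is to treat the base sampler's index sequence as a "dataset" and let DistributedSampler split *positions* in that sequence across replicas.

```python
from torch.utils.data import Dataset, DistributedSampler, Sampler


class DatasetFromSampler(Dataset):
    """Exposes the indices produced by a sampler as a dataset."""

    def __init__(self, sampler: Sampler):
        self.sampler = sampler
        self.indices = None

    def __getitem__(self, index):
        # Materialize the sampler's indices lazily, once per epoch.
        if self.indices is None:
            self.indices = list(self.sampler)
        return self.indices[index]

    def __len__(self):
        return len(self.sampler)


class DistributedSamplerWrapper(DistributedSampler):
    """Distributes the output of any base sampler across replicas.

    DistributedSampler picks positions 0..len(sampler)-1 for this rank;
    we then map those positions back through the base sampler's indices.
    """

    def __init__(self, sampler, num_replicas=None, rank=None, shuffle=True):
        super().__init__(
            DatasetFromSampler(sampler),
            num_replicas=num_replicas,
            rank=rank,
            shuffle=shuffle,
        )
        self.sampler = sampler

    def __iter__(self):
        # Re-wrap each epoch so stochastic base samplers are re-drawn.
        self.dataset = DatasetFromSampler(self.sampler)
        positions_for_this_rank = super().__iter__()
        return iter([self.dataset[pos] for pos in positions_for_this_rank])
```

Passing `num_replicas` and `rank` explicitly lets you use the wrapper without initializing a process group, which is handy for testing; in real DDP training you would omit them and let DistributedSampler read them from the process group.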