How to use WeightedSampler along with a custom Sampler?

I am using a custom sampler class similar to the one described here: How upload sequence of image on video-classification, suggested by @ptrblck, to stack up images sequentially from the dataset before feeding them into the network.

Now I also wanted to use WeightedSampler to balance the classes.

Any ideas on how to use a WeightedSampler together with a custom sampler class?

Thanks in advance !! :slightly_smiling_face:

I’d probably take @ptrblck’s sampler and

  • compute the class weights in __init__,
  • have a tensor that contains the class weight for each index,
  • use torch.multinomial with the weights instead of randperm.
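The first two bullets could be sketched like this (a toy stand-in: `labels` is an assumed sequence of integer class labels, one per dataset index, not anything from the original code):

```python
import torch

# toy stand-in: labels[i] is the class of sample i
labels = torch.tensor([0, 0, 0, 0, 0, 1, 1, 2])

# inverse-frequency class weights, computed once in __init__
counts = torch.bincount(labels).float()   # class counts: 5, 2, 1
class_weights = 1.0 / counts

# one weight per index, so rarer classes are drawn more often
index_weights = class_weights[labels]
```

`index_weights` is exactly the tensor you would hand to `torch.multinomial` in the third step.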

For the latter, you can compare SubsetRandomSampler

    def __iter__(self):
        return (self.indices[i] for i in torch.randperm(len(self.indices), generator=self.generator))

to WeightedRandomSampler

    def __iter__(self):
        rand_tensor = torch.multinomial(self.weights, self.num_samples, self.replacement, generator=self.generator)
        return iter(rand_tensor.tolist())

(except that you want to map through the indices here, too).

Note that you likely want replacement=True.
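Putting the two `__iter__` variants together might look roughly like this. This is only a sketch: `WeightedSubsetSampler` and its attributes are illustrative names, not PyTorch API.

```python
import torch
from torch.utils.data import Sampler

class WeightedSubsetSampler(Sampler):
    """Sketch: draw from a fixed index list with per-index weights."""
    def __init__(self, indices, weights, generator=None):
        self.indices = list(indices)
        self.weights = torch.as_tensor(weights, dtype=torch.float)
        self.generator = generator

    def __iter__(self):
        # multinomial with replacement=True takes the role of randperm
        rand = torch.multinomial(self.weights, len(self.indices),
                                 replacement=True, generator=self.generator)
        return (self.indices[i] for i in rand.tolist())

    def __len__(self):
        return len(self.indices)
```

The generator expression maps each drawn position back through `self.indices`, which is the "you want the indices here, too" part.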

Best regards



Hello @tom, thanks for your response. :smiley:

To compute the weights, I need the count for each class. But in my case the sampler is also used to create the dataset, so I can only access the sample counts after the dataset has been created.
(I use the sampler to get the indices of the images and stack them up sequentially to create samples, then create the dataset and load it with the DataLoader.)

Do you have any suggestions for this? I can attach the code if required.

Thanks again

I don’t think there is anything wrong per se with looping over the dataset to get the class distribution. That said, if it takes a long time and you expect to run your training often, the typical thing is to make it a preprocessing step (just like e.g. the famous ImageNet mean and std for normalization have been part of preprocessing before people just kept them hardcoded).
Creating a DataLoader isn’t that expensive (the expensive work only happens while iterating, and is re-done every epoch), so there is nothing wrong with having one loader that is used to collect statistics and then creating a new one with the weighted sampler.
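A minimal two-pass sketch of that idea, with toy tensors standing in for the actual dataset:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset, WeightedRandomSampler

# toy imbalanced dataset: 8 samples, classes 0 and 1 (stand-in for real data)
data = torch.arange(8).float().unsqueeze(1)
labels = torch.tensor([0, 0, 0, 0, 0, 0, 1, 1])
dataset = TensorDataset(data, labels)

# pass 1: a plain loader used only to collect class statistics
stat_loader = DataLoader(dataset, batch_size=4)
counts = torch.zeros(2)
for _, y in stat_loader:
    counts += torch.bincount(y, minlength=2).float()

# pass 2: per-sample weights -> weighted sampler -> training loader
sample_weights = (1.0 / counts)[labels]
sampler = WeightedRandomSampler(sample_weights, num_samples=len(dataset),
                                replacement=True)
train_loader = DataLoader(dataset, batch_size=4, sampler=sampler)
```

The same `dataset` object is reused; only the DataLoader (and its sampler) changes between the statistics pass and training.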

Best regards



Thanks. If I understood correctly, you mean I can have one sampler for preparing the dataset and a weighted sampler for iterating it?

Yes, you can use the same Dataset but wrap it in different DataLoaders (with different samplers).

Thank you, @tom.

This helps a lot. :slightly_smiling_face: