Updating weights of WeightedRandomSampler between epochs

I would like to update WeightedRandomSampler.weights between epochs, e.g. for curriculum learning. A look at its class implementation suggests that I can simply update the "weights" attribute between epochs, because the __iter__ method just draws from a multinomial distribution with self.weights as input. Is that correct, or are there broader implications that I'm missing, e.g. something in the Sampler base class?

It should work if you manipulate the weights attribute directly; however, you also shouldn't see any performance regression if you create a new sampler and a new DataLoader after each epoch, as that is relatively cheap compared to the training iterations themselves.
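A minimal sketch of the first approach, using a hypothetical toy dataset and an illustrative curriculum schedule (the dataset, the "easy"/"hard" split, and the weight schedule below are assumptions, not from the thread). The DataLoader calls iter(sampler) at the start of each epoch, so reassigning sampler.weights between epochs takes effect on the next pass:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset, WeightedRandomSampler

# Hypothetical toy dataset: 100 samples, second half labeled "hard"
data = torch.arange(100).float().unsqueeze(1)
targets = (torch.arange(100) >= 50).long()
dataset = TensorDataset(data, targets)

# Start by strongly favoring the "easy" first half
weights = torch.full((100,), 0.1, dtype=torch.double)
weights[targets == 0] = 1.0
sampler = WeightedRandomSampler(weights, num_samples=100, replacement=True)
loader = DataLoader(dataset, batch_size=10, sampler=sampler)

for epoch in range(3):
    for x, y in loader:
        pass  # training step would go here
    # Curriculum update: raise the weight of "hard" samples each epoch
    # (illustrative linear schedule, capped at 1.0)
    hard_weight = min(1.0, 0.1 + 0.3 * (epoch + 1))
    new_weights = torch.full((100,), hard_weight, dtype=torch.double)
    new_weights[targets == 0] = 1.0
    sampler.weights = new_weights  # __iter__ reads this on the next epoch
```

Note that with num_workers > 0 the sampler still lives in the main process (only the resulting indices are sent to workers), so the update should still take effect at the next epoch boundary.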

Thanks @ptrblck, you're right on both counts. Updating the weights attribute worked, and I could also have just created a new sampler and DataLoader.