Changing dataset during training, adapting len

Hi, we would like to go easy on our model and show it only easy samples for the first few epochs, then let in more hard samples every few epochs.
We coded the following solution:

from torchvision.datasets import ImageFolder

class ImageFolderNew(ImageFolder):
    def __len__(self) -> int:
        return len(self.filtered_imgs)

    def filtered_images(self, p=0.1):
        # Rebuild the pool of allowed samples; we call this again every few
        # epochs with a larger p so that more hard samples are let in.
        self.filtered_imgs = {}
        for k in range(len(self.imgs)):
            path = self.imgs[k][0]
            if not condition(path, p):  # placeholder: our difficulty criterion
                self.filtered_imgs[k] = self.imgs[k]

    def __getitem__(self, index):
        # Here we work on filtered_imgs, not self.imgs!
        path = self.filtered_imgs[index][0]  # instead of self.imgs[index][0]
        ...
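For context, the filtering is invoked between epochs along these lines (the root path, schedule, and batch size here are only placeholders):

from torch.utils.data import DataLoader

dataset = ImageFolderNew(root="data/train")  # placeholder root
for epoch in range(num_epochs):
    # every few epochs, refresh the pool so more hard samples are allowed in
    dataset.filtered_images(p=min(1.0, 0.1 + 0.1 * (epoch // 5)))
    loader = DataLoader(dataset, batch_size=32, shuffle=True)
    for images, targets in loader:
        ...  # training step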

This solution works well; the only issue is that __getitem__ still receives indices sampled from (0, len(self.imgs)) and not from (0, len(self.filtered_imgs)). How can we make __getitem__ understand that we have fewer samples in this epoch, but will have more in the next one?

Any help would be appreciated!
Thanks!!

One solution could be torch.utils.data.IterableDataset, where the dataset length and the samples that make up each batch can be controlled by a custom, user-defined function inside the dataset class.
https://pytorch.org/docs/stable/data.html#iterable-style-datasets
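A minimal sketch of that idea, assuming a user-defined is_easy criterion and a per-epoch schedule (the class name, set_epoch method, and all numbers below are placeholders, not part of any PyTorch API; worker sharding is omitted for simplicity):

import random
from torch.utils.data import IterableDataset, DataLoader

class CurriculumDataset(IterableDataset):
    def __init__(self, samples, is_easy):
        super().__init__()
        self.samples = list(samples)   # full list of samples, e.g. (path, label) pairs
        self.is_easy = is_easy         # user-defined difficulty criterion
        self.allowed = list(samples)   # pool yielded in the current epoch

    def set_epoch(self, epoch):
        # Rebuild the pool: start with easy samples only, let hard ones in over time.
        hard_ratio = min(1.0, epoch / 20)   # placeholder schedule
        self.allowed = [s for s in self.samples
                        if self.is_easy(s) or random.random() < hard_ratio]

    def __iter__(self):
        # Intended for num_workers=0; with workers, split self.allowed per worker.
        pool = list(self.allowed)
        random.shuffle(pool)
        yield from pool

# usage:
# dataset = CurriculumDataset(samples, is_easy=lambda s: ...)
# for epoch in range(num_epochs):
#     dataset.set_epoch(epoch)
#     loader = DataLoader(dataset, batch_size=32)  # no shuffle/sampler for iterable datasets
#     for batch in loader:
#         ...

Because the dataset itself decides what to yield each epoch, its effective length grows automatically as the pool grows, and no sampler ever sees stale indices.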