How to get random subset of original dataset for each epoch?

I would like to know if there is a way to train a model on a different random subset of a larger training set in each epoch.

Currently I use Subset() to obtain a smaller version of the original dataset.

    import torchvision
    from torch.utils.data import Subset, DataLoader

    trainset = torchvision.datasets.CIFAR10(root="./data", train=True, download=True)
    subset_length = 1000
    trainset = Subset(trainset, range(subset_length))
    trainloader = DataLoader(trainset)

But this subset is fixed across all epochs (the same 1000 images). I would like a more dynamic subset that lets me train on a fraction of the original dataset, drawing a new random subset for each epoch.

Do I have to modify the DataLoader's __iter__() method to achieve this, or is there a simpler way?

The simple approach would be to create a new sampler and re-create the DataLoader with it for each epoch. Assuming you are lazily loading the samples in the Dataset, the overhead of re-creating the DataLoader should be small.
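
Here is a minimal sketch of that approach: each epoch, draw a fresh set of random indices with torch.randperm, wrap them in a SubsetRandomSampler, and build a new DataLoader from it. The batch size, number of epochs, and the ToTensor transform are placeholder choices, not part of the original question.

    import torch
    import torchvision
    import torchvision.transforms as transforms
    from torch.utils.data import DataLoader, SubsetRandomSampler

    trainset = torchvision.datasets.CIFAR10(
        root="./data", train=True, download=True,
        transform=transforms.ToTensor())

    subset_length = 1000
    num_epochs = 5  # placeholder value

    for epoch in range(num_epochs):
        # Draw a fresh random subset of indices for this epoch
        indices = torch.randperm(len(trainset))[:subset_length]
        sampler = SubsetRandomSampler(indices)

        # Re-create the DataLoader with the new sampler
        trainloader = DataLoader(trainset, batch_size=64, sampler=sampler)

        for images, labels in trainloader:
            ...  # your training step goes here

Since the sampler already restricts iteration to the chosen indices, there is no need to wrap the dataset in Subset() anymore; the full dataset is passed to the DataLoader and only the sampled 1000 images are visited per epoch.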