Do I have to explicitly call the dataloader in each epoch for training augmentation?

Hi. I’d like to (virtually) augment my data for training. I just want more random samples, so I plan to train the network for a larger number of epochs. I’m wondering whether my dataset will be transformed differently in every epoch, or whether I have to explicitly call the dataloader in each epoch.

The transformations are set in the Dataset and are applied on the fly for each sample.
If you are using random transformations, each sample will be randomly transformed in each epoch.

You don’t have to recreate the DataLoader in each epoch.
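Here is a minimal sketch of what that looks like, assuming a toy tensor dataset and a torchvision random transform (`RandomHorizontalFlip`); the dataset class and shapes are just placeholders for illustration:

```python
import torch
from torch.utils.data import Dataset, DataLoader
from torchvision import transforms


class MyDataset(Dataset):
    """Toy dataset: the transform is applied on the fly in __getitem__."""

    def __init__(self, transform=None):
        self.data = torch.randn(8, 3, 24, 24)  # hypothetical samples
        self.transform = transform

    def __len__(self):
        return len(self.data)

    def __getitem__(self, idx):
        x = self.data[idx]
        if self.transform is not None:
            # A new random outcome is drawn every time this is called,
            # i.e. every time the sample is loaded in any epoch.
            x = self.transform(x)
        return x


dataset = MyDataset(transform=transforms.RandomHorizontalFlip(p=0.5))
loader = DataLoader(dataset, batch_size=4, shuffle=True)

# The same DataLoader is reused across epochs; since __getitem__ runs
# again for each sample, the random transformations differ per epoch.
for epoch in range(3):
    for batch in loader:
        pass  # training step would go here
```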
