How to load a training dataset from the hard disk in parts and train the model on each part?

Hello Everyone.
I want to train a CNN model, but the training dataset is too massive. While training, the system runs out of memory because of the amount of data.
I don’t want to load the complete dataset from the hard disk at once.
Is there a way to load the dataset from the hard disk in parts and train the model on each part of the data?

What kind of data do you have?
Images should be no problem, since you can load each image separately.
For .csv data, this thread might be helpful.
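As a minimal sketch of the "load each image separately" idea (the file list, labels, and paths here are hypothetical and only for illustration), a custom Dataset can open an image from disk only when it is indexed, so the full dataset never has to sit in memory:

```python
from torch.utils.data import Dataset
from PIL import Image

class LazyImageDataset(Dataset):
    """Opens one image per __getitem__ call instead of loading everything up front."""
    def __init__(self, file_paths, labels, transform=None):
        self.file_paths = file_paths  # hypothetical list of image file paths
        self.labels = labels          # hypothetical list of integer class labels
        self.transform = transform

    def __len__(self):
        return len(self.file_paths)

    def __getitem__(self, idx):
        # Image is read from the hard disk only at this point
        image = Image.open(self.file_paths[idx]).convert("RGB")
        if self.transform is not None:
            image = self.transform(image)
        return image, self.labels[idx]
```

Wrapped in a DataLoader, this keeps only the current batch (plus prefetched batches) in memory during training.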


My dataset consists of images.
Can I use ImageFolder for this problem?

Yes, you could use ImageFolder for this; just make sure your data is stored in the required folder structure.
Alternatively, you could write your own Dataset and load/preprocess the images in the __getitem__ method.
Here is a nice tutorial on data loading.
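As a rough sketch of the ImageFolder route (assuming a hypothetical root directory data/train with one sub-folder per class, e.g. data/train/cat/ and data/train/dog/), the DataLoader then streams batches from disk during training:

```python
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# ImageFolder expects one sub-folder per class under the root directory
train_dataset = datasets.ImageFolder(root="data/train", transform=transform)

# Images are read lazily, batch by batch, so the whole dataset never sits in RAM
train_loader = DataLoader(train_dataset, batch_size=32, shuffle=True, num_workers=4)

for images, targets in train_loader:
    # forward/backward pass on one batch at a time
    ...
```

Increasing num_workers lets several processes read and decode images from disk in parallel, which usually hides most of the loading latency.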


Thanks a lot for your help. I will go through all of your guidance.