Best practices for working with large datasets (many files)? How to store and load data?

I am currently working on an object detection pipeline with the xView satellite image dataset inside a Google Colab notebook. I therefore have a folder with all the images in my Google Drive, which I mount into the notebook. Since there are 80k+ images in that single folder, this (unsurprisingly?) leads to errors when loading data from it.
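For context, this is roughly what my current setup looks like (a minimal sketch that only runs inside Colab; the Drive path, the `XViewImages` name, and the missing transform/target logic are placeholders, not my exact code):

```python
import os

from PIL import Image
from torch.utils.data import Dataset, DataLoader
from torchvision.transforms.functional import to_tensor
from google.colab import drive

# Mount Google Drive into the Colab runtime
drive.mount('/content/drive')

# Placeholder path -- adjust to wherever the xView images actually live
IMAGE_DIR = '/content/drive/MyDrive/xview/train_images'

class XViewImages(Dataset):
    """Minimal dataset that indexes every image in one flat Drive folder."""

    def __init__(self, image_dir):
        self.image_dir = image_dir
        # Listing a Drive folder with 80k+ files is where things get slow/flaky
        self.filenames = sorted(os.listdir(image_dir))

    def __len__(self):
        return len(self.filenames)

    def __getitem__(self, idx):
        path = os.path.join(self.image_dir, self.filenames[idx])
        image = Image.open(path).convert('RGB')
        # A real pipeline would also load detection targets and apply transforms here
        return to_tensor(image)

# batch_size=1 because the raw satellite images are not all the same size
loader = DataLoader(XViewImages(IMAGE_DIR), batch_size=1, num_workers=2)
```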

How are you guys storing/loading big datasets in PyTorch?