I’m trying to load a dataset that is about 16 GB in Colab, but as you know, that isn’t possible due to the limited RAM. The best solution (in my opinion!) would be to load the data in batches (my data is located in my Google Drive). The problem is that I can’t figure out how to write a custom data loader that does this for me (i.e., loads data from my Google Drive in specified batches). Can anyone help me?
Note: My data are in .npy format.
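For context, here’s a minimal sketch of the kind of loader I have in mind, assuming hypothetical paths like `/content/drive/MyDrive/data.npy` (adjust to your own layout). It uses NumPy memory mapping so the full 16 GB array never has to fit in RAM at once:

```python
import numpy as np
import torch
from torch.utils.data import Dataset, DataLoader

# Hypothetical paths on Google Drive; adjust to your own files.
DATA_PATH = "/content/drive/MyDrive/data.npy"
LABEL_PATH = "/content/drive/MyDrive/labels.npy"

class NpyDataset(Dataset):
    """Reads samples lazily from .npy files via memory mapping,
    so only the requested slices are ever pulled into RAM."""

    def __init__(self, data_path, label_path):
        # mmap_mode="r" keeps the arrays on disk; indexing reads on demand
        self.data = np.load(data_path, mmap_mode="r")
        self.labels = np.load(label_path, mmap_mode="r")

    def __len__(self):
        return len(self.data)

    def __getitem__(self, idx):
        # np.array(...) copies the single sample into memory
        # (the copy is also writable, which torch.from_numpy requires)
        x = torch.from_numpy(np.array(self.data[idx]))
        y = torch.from_numpy(np.array(self.labels[idx]))
        return x, y

dataset = NpyDataset(DATA_PATH, LABEL_PATH)
# batch_size controls how many samples are loaded per iteration
loader = DataLoader(dataset, batch_size=64, shuffle=True)
```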
Thanks for your answer.
I couldn’t understand how to load them in batches (in other words, what parameter defines the batch size?).
Moreover, I’ve combined my data and its labels into a single TensorDataset and saved it to my Google Drive (using torch.save). Is it possible to load it in batches?
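For reference, here’s a minimal sketch of what I’m imagining, assuming the dataset was saved to a hypothetical path like `/content/drive/MyDrive/dataset.pt`; as I understand it, the `batch_size` argument of `DataLoader` is what sets the batch size:

```python
import torch
from torch.utils.data import DataLoader

# Hypothetical path to the file saved with torch.save(TensorDataset(...), ...)
dataset = torch.load("/content/drive/MyDrive/dataset.pt")

# batch_size sets how many samples come back per iteration
loader = DataLoader(dataset, batch_size=32, shuffle=True)

for batch_data, batch_labels in loader:
    pass  # training step goes here
```

(Note that `torch.load` still deserializes the whole file into RAM, so this only helps if the saved tensors actually fit in memory.)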