Training a model with a large dataset on a GPU with insufficient memory (efficiently)

Write a custom dataset that does lazy loading: in `__init__`, load only a lightweight index of the data (this could be file paths, or row/column/line information for each sample). The actual `.npy` data is not loaded at this point; using that index, load each sample on demand in `__getitem__` and let the `DataLoader` assemble batches, as in the sketch below.
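Here is a minimal sketch of that idea, assuming one `.npy` file per sample with paths and labels known upfront; the class name `NpyLazyDataset` and the variable names are hypothetical, not from the original post:

```python
import numpy as np
import torch
from torch.utils.data import Dataset, DataLoader

class NpyLazyDataset(Dataset):
    def __init__(self, file_paths, labels):
        # Store only the lightweight index (paths + labels);
        # no array data is read from disk here.
        self.file_paths = file_paths
        self.labels = labels

    def __len__(self):
        return len(self.file_paths)

    def __getitem__(self, idx):
        # Load a single sample from disk on demand.
        sample = np.load(self.file_paths[idx])
        x = torch.from_numpy(sample).float()
        y = torch.tensor(self.labels[idx])
        return x, y

# The DataLoader batches individual samples, so only one batch
# (times num_workers prefetching) needs to fit in memory at a time:
# loader = DataLoader(NpyLazyDataset(paths, labels), batch_size=32,
#                     shuffle=True, num_workers=4)
```

If instead all samples live in a single large `.npy` file, `np.load(path, mmap_mode="r")` memory-maps the array so `__getitem__` can slice rows without reading the whole file into RAM.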

Check the link below; they do something very similar!
