Loading a file in chunks using `torch.load`

I am working on a problem where I need to keep my entire dataset as one big .pt file on disk. Since that file might exceed the size of my GPU memory or RAM, I would like a way to load it in chunks using `torch.load`.

Is there a way to do that? If not, is it a good potential feature request?

You could store the tensor as a NumPy array and use `numpy.memmap` to access chunks of the array.
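
A minimal sketch of that approach, assuming the dataset is a single 2-D float32 tensor; the file name `dataset.bin`, the shape, and the chunk size are placeholders for illustration:

```python
import numpy as np
import torch

# Illustrative shape/dtype; in practice these must match how the data was written.
shape = (100_000, 128)
dtype = np.float32

# One-time conversion: write the tensor's data out as a raw binary file
# instead of a .pt file.
tensor = torch.rand(shape)
tensor.numpy().tofile("dataset.bin")

# Memory-map the file: data is only read from disk when a slice is accessed.
mm = np.memmap("dataset.bin", dtype=dtype, mode="r", shape=shape)

# Process the dataset in chunks that fit in RAM / GPU memory.
chunk_size = 10_000
for start in range(0, shape[0], chunk_size):
    chunk = np.array(mm[start:start + chunk_size])  # copy just this chunk into RAM
    batch = torch.from_numpy(chunk)                 # wrap as a tensor without copying
    # batch = batch.to("cuda")                      # optionally move the chunk to the GPU
    # ... run your model / preprocessing on `batch` here ...
```

Note that `torch.from_numpy` shares memory with the copied chunk, so only one chunk is resident at a time; the memmap itself stays on disk because it is opened read-only.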