Is there a way to load a single big (97.2 GB) file using torch.load?

I have only 16 GB of CPU RAM, and I can't find a way to load a 97.2 GB dataset. As I understand it, batching happens further down the pipeline; the first step of data loading uses torch.load on the whole file.

Is there a way to somehow split the file into pieces?
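To illustrate what I mean by splitting: something like the sketch below, assuming the file is a dict of tensors saved with `torch.save`, and assuming the one-time split could be done on a machine with enough RAM to hold the full file:

```python
import torch

# Stand-in for the big checkpoint dict; splitting it this way
# would have to happen once on a machine that can load it fully.
full = {"layer1": torch.randn(4, 4), "layer2": torch.randn(4, 4)}

# Save each entry as its own shard file so later loads stay small.
for name, tensor in full.items():
    torch.save(tensor, f"shard_{name}.pt")

# Later, load one shard at a time instead of the whole file.
layer1 = torch.load("shard_layer1.pt")
```

Is something along these lines the recommended approach, or is there a built-in way?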