Loading a Tensor from file in batches

Based on this older post, it seems you could use a Storage to load the data in chunks.
However, I don't see an offset argument there, so I think the proper way would be to use np.memmap and load chunks of a NumPy array (assuming you were able to store the data via numpy in the first place).
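A minimal sketch of the np.memmap approach, using only NumPy (the file path, shape, and batch size are made up for illustration). The memmap only pages in the slices you actually touch, so each batch is read from disk on demand; wrapping a chunk with `torch.from_numpy` would give you the tensor, as noted in the comment:

```python
import numpy as np
import os
import tempfile

# Hypothetical setup: a large float32 array stored as raw binary.
path = os.path.join(tempfile.mkdtemp(), "data.bin")
data = np.arange(100, dtype=np.float32).reshape(25, 4)
data.tofile(path)  # raw bytes, no .npy header

# Memory-map the file; nothing is loaded into RAM yet.
mmap = np.memmap(path, dtype=np.float32, mode="r", shape=(25, 4))

batch_size = 10
batches = []
for start in range(0, mmap.shape[0], batch_size):
    # Copying the slice materializes just this chunk in memory.
    chunk = np.array(mmap[start:start + batch_size])
    # batch = torch.from_numpy(chunk)  # wrap as a tensor (requires torch)
    batches.append(chunk)
```

Note that you need to know the dtype and shape up front, since a raw binary file carries no metadata (unlike .npy, which `np.load(..., mmap_mode="r")` can also memory-map).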