How to load a huge matrix as a dataset

hey guys, new to PyTorch here.
I use MATLAB to make a 2-D matrix with a huge size; almost every row is a sample.
It's a big file, and when I try to use it for training, PyTorch says it is out of memory. So I need to use torch.utils.data.Dataset,
but in the tutorial the example loads images, so how do I load a big file like in NLP?

Maybe look at the torchtext package.
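Another option, since the data is already a numeric matrix, is to avoid loading the whole file into RAM at all: save it as a `.npy` file (e.g. export from MATLAB via `scipy.io`/`numpy`), open it with `np.load(..., mmap_mode="r")`, and wrap it in a custom `Dataset` that reads one row at a time. Here is a minimal sketch, assuming the matrix has been converted to a `.npy` file; the filename and shapes below are just placeholders:

```python
import numpy as np
import torch
from torch.utils.data import Dataset, DataLoader

class MatrixDataset(Dataset):
    """Lazily reads rows of a large 2-D matrix stored as a .npy file."""

    def __init__(self, path):
        # mmap_mode="r" keeps the array on disk; a row is only
        # read into memory when it is actually indexed.
        self.data = np.load(path, mmap_mode="r")

    def __len__(self):
        return self.data.shape[0]

    def __getitem__(self, idx):
        # Copy the single row so the returned tensor owns its memory.
        row = np.array(self.data[idx])
        return torch.from_numpy(row)

# Small demo file standing in for the huge matrix.
demo = np.random.rand(100, 8).astype(np.float32)
np.save("demo_matrix.npy", demo)

ds = MatrixDataset("demo_matrix.npy")
loader = DataLoader(ds, batch_size=16, shuffle=True)
batch = next(iter(loader))
print(batch.shape)  # torch.Size([16, 8])
```

A `DataLoader` on top of this gives you batching and shuffling for free, and only one batch of rows is ever resident in memory at a time.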