MemoryError in a list of lists of lists of lists

Hi
I have a nested Python list containing matrices of size 130×130, and building it raises a MemoryError. If I instead use the average dimension of my data, i.e. 14×14, the list is created and converted to NumPy arrays without any issue. Could anyone please tell me why I get a memory error when each matrix is 130×130? I have 118,000 models in total, each with three 130×130 matrices, so the full dataset is roughly 118,000 × 3 × 130 × 130 ≈ 6 billion values (about 24 GB even as float32, versus only about 69 million values at 14×14). What is an efficient way of feeding this data to a PyTorch DataLoader for training a CNN?
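
Here is a minimal sketch of what I am doing (the names and the random data are placeholders for my real preprocessing):

```python
import numpy as np

n_models = 118_000   # total number of models
size = 130           # works fine when size = 14

data = []
for _ in range(n_models):
    # Three size x size matrices per model; random values stand in
    # for my real data here.
    model = [np.random.rand(size, size).tolist() for _ in range(3)]
    data.append(model)

# With size = 14 this all succeeds; with size = 130 I hit a
# MemoryError around this point.
arr = np.asarray(data, dtype=np.float32)
print(arr.shape)  # expected: (118000, 3, 130, 130)
```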
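
For reference, this is the naive route I would take once the array exists (just a sketch; `labels` is a placeholder since I haven't shown my targets). I suspect it won't scale to the full dataset, which is why I am asking about a more efficient approach:

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

# Naive route (sketch): keep everything in memory and wrap it in a
# TensorDataset. 'arr' is the (118000, 3, 130, 130) float32 array above.
tensors = torch.from_numpy(arr)
labels = torch.zeros(len(tensors), dtype=torch.long)  # placeholder labels

loader = DataLoader(TensorDataset(tensors, labels), batch_size=32, shuffle=True)

for batch, target in loader:
    # batch has shape (32, 3, 130, 130), ready for a CNN
    ...
```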