Not enough memory

Hello,

I attempted to build a CNN with PyTorch based on Kim et al., “Convolutional Neural Networks for Sentence Classification”.

I run the model on the CPU of a machine with 32GB of RAM and I keep getting a “not enough memory” error:

RuntimeError: $ Torch: not enough memory: you tried to allocate 23GB. Buy new RAM! at /opt/conda/conda-bld/pytorch_1503970438496/work/torch/lib/TH/THGeneral.c:270

It indeed looks as if 23GB are being allocated, but I don’t understand why I am getting an out-of-memory error.

On a side note: is there a GloVe embedding layer how-to available anywhere? I am new to the topic and need to verify that what I did is correct.
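For context, this is roughly the pattern I followed (the GloVe file name and the toy vocabulary here are placeholders, not my real data):

```python
import numpy as np
import torch
import torch.nn as nn

# Placeholder vocabulary; in practice this comes from the training corpus.
vocab = {"<pad>": 0, "<unk>": 1, "movie": 2, "great": 3}
embed_dim = 300

# Start from small random vectors so words missing from GloVe are still initialised.
weights = np.random.uniform(-0.25, 0.25, (len(vocab), embed_dim)).astype("float32")

# Copy the pretrained vector for every vocabulary word found in the GloVe file.
with open("glove.840B.300d.txt", encoding="utf-8") as f:
    for line in f:
        parts = line.rstrip().split(" ")
        word, vec = parts[0], parts[1:]
        if word in vocab and len(vec) == embed_dim:
            weights[vocab[word]] = np.asarray(vec, dtype="float32")

embedding = nn.Embedding(len(vocab), embed_dim, padding_idx=0)
embedding.weight.data.copy_(torch.from_numpy(weights))
embedding.weight.requires_grad = False  # the “static” variant in Kim et al. keeps the vectors frozen
```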

If it really is out of memory, then it is normal for you to see an OOM. Is there a reason you expect otherwise?

Running large CNNs on the CPU is especially memory demanding. If you have a GPU, use that with cuDNN instead.
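Something along these lines (a rough sketch on a recent PyTorch; the toy model just stands in for yours):

```python
import torch
import torch.nn as nn
import torch.backends.cudnn as cudnn

cudnn.benchmark = True  # let cuDNN pick the fastest convolution kernels for your input sizes

# Toy stand-in for the sentence-classification CNN.
model = nn.Sequential(nn.Conv1d(300, 100, kernel_size=3), nn.ReLU()).cuda()
inputs = torch.randn(50, 300, 40).cuda()  # (batch, embed_dim, seq_len)
outputs = model(inputs)
```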


Monitoring the memory, it looks as if PyTorch had already allocated 23GB before throwing the OOM error, even though the machine I am running this on has 32GB of RAM available. (The graph in my previous post shows the memory allocated on the machine at the time of the OOM error.)
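This is roughly how I am watching the process (a small sketch using psutil, which is a separate dependency):

```python
import os
import psutil

proc = psutil.Process(os.getpid())
print("resident memory: %.2f GB" % (proc.memory_info().rss / 1024 ** 3))
```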

I’d love to run the CNN on my GPU, but memory is an even bigger constraint there. I have two GTX 1070s with 8GB of RAM each, but I haven’t figured out whether I can use the memory of both of them in the same training run.
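From what I can tell, nn.DataParallel does not pool the two 8GB cards into one 16GB space; it splits each batch across the GPUs, so each card only holds a model replica plus the activations for its share of the batch. A minimal sketch of what I would try:

```python
import torch
import torch.nn as nn

# Toy stand-in for the actual model.
model = nn.Sequential(nn.Conv1d(300, 100, kernel_size=3), nn.ReLU())
model = nn.DataParallel(model, device_ids=[0, 1]).cuda()

inputs = torch.randn(50, 300, 40).cuda()  # the batch is split 25/25 across the two GPUs
outputs = model(inputs)
```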

I see. I’ll have to check the code to see what values it uses in the error message.

cuDNN uses a lot less memory (and is faster); that is why I suggested it. 🙂 Even plain CUDA might use less memory than the CPU.
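If you want to compare the two yourself, the cuDNN backend can be toggled off so convolutions fall back to plain CUDA kernels (a sketch assuming a recent PyTorch with the peak-memory counters available):

```python
import torch
import torch.nn as nn
import torch.backends.cudnn as cudnn

def peak_forward_memory_mb(use_cudnn):
    # Peak GPU memory for one forward pass (includes the toy model's weights).
    cudnn.enabled = use_cudnn
    torch.cuda.reset_peak_memory_stats()
    model = nn.Conv1d(300, 100, kernel_size=3).cuda()
    model(torch.randn(50, 300, 40).cuda())
    torch.cuda.synchronize()
    return torch.cuda.max_memory_allocated() / 1024 ** 2

print("cuDNN      : %.1f MB" % peak_forward_memory_mb(True))
print("plain CUDA : %.1f MB" % peak_forward_memory_mb(False))
```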