VGG: Exploding RAM with simple code, bug?

Hi all,
Please find attached a very simple script that just stores VGG outputs:
http://pastebin.com/8Jq21KWD
It requires over 5 GB of RAM on my laptop, and this is after the model is loaded (the memory grows inside the loop).
Any ideas? :slight_smile:

Hi,

You are keeping track of the full history of the outputs by saving the output Variable.
You might want to do this instead:

import torch
from torch.autograd import Variable
from torchvision import models

vgg19 = models.vgg19()
l = []
for i in range(10):
    out = vgg19(Variable(torch.randn(1, 3, 224, 224)))
    l.append(out.data)  # store only the tensor, not the Variable and its history

Thank you very much, that indeed solved the problem!
I need to take a deeper look at the PyTorch architecture :))

What happens is that PyTorch allocates the intermediate buffers on demand, and frees them as soon as they go out of scope.
Variables keep the history of all the computations that were performed to produce them.
A common mistake is to do current_loss += loss: this will not free the memory, because you keep a reference to the whole history of computations. Instead you should do, for example, current_loss += loss.data[0].
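As a minimal sketch of the difference (the model, data, and loss function here are made up purely for illustration):

import torch
import torch.nn as nn
from torch.autograd import Variable

model = nn.Linear(10, 1)
criterion = nn.MSELoss()

current_loss = 0.0
for _ in range(100):
    x = Variable(torch.randn(4, 10))
    y = Variable(torch.randn(4, 1))
    loss = criterion(model(x), y)
    # current_loss += loss        # wrong: every iteration's graph stays reachable
    current_loss += loss.data[0]  # right: accumulate a plain Python float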
Also, if you only want to perform forward-pass computations, using volatile Variables will save you a ton of memory.

Variable(torch.randn(1, 3, 224, 224), volatile=True)
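Applied to the loop from earlier in the thread, that could look like this (same sketch as above):

l = []
for i in range(10):
    x = Variable(torch.randn(1, 3, 224, 224), volatile=True)
    out = vgg19(x)  # volatility propagates: no history is recorded at all
    l.append(out.data)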

Thank you very much for your explanation!