Possible Memory Leak on CPU

Need HELP!!!
I'm running this code on Ubuntu 16.04 with Python 3.5.x and 8 GB of RAM.
With each epoch, RAM consumption increases by around 200 MB and my laptop starts to hang. Can somebody tell me what I might be doing wrong?

I tried the same code on a Mac and it worked properly.

EDIT: The problem has been solved.

What is out.backward(out)?

Hey @Federico_Pala, I think I figured out why it was not working properly. I replaced out.backward(out) with out.backward(torch.ones(out.size())) and RAM consumption is not increasing any more.
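
For reference, a minimal sketch of the change being described, using the old Variable-based autograd API from this era of PyTorch; the model output here is a made-up stand-in, since the original code isn't shown in the thread:

```python
import torch
from torch.autograd import Variable

# Stand-in for the model output in the original code.
x = Variable(torch.randn(4, 10), requires_grad=True)
out = x * 2

# Leaky version: the gradient argument is itself a non-volatile Variable.
# out.backward(out)

# Fixed version: a plain tensor of ones as the gradient argument.
out.backward(torch.ones(out.size()))
```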

Hey, could you give me any insight into why passing the Variable out in the backward() call was increasing memory consumption? :slight_smile:

Don’t know, you are missing the loss function!

Also, your net should be in train mode, not eval.

I was trying to implement Deep Dream in PyTorch, which is why I was passing out to out.backward(): so that I could get the gradient of my input image with respect to the output.

I found that if you pass a Variable object to backward(), RAM consumption increases with each iteration (I don't know why). So now I'm passing out.data, which is a torch.Tensor, and it's working properly :slight_smile:.
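
A minimal sketch of the kind of Deep Dream gradient step being described; the model, layer choice, and step size here are illustrative assumptions, not the original code:

```python
import torch
from torch.autograd import Variable  # old-style API used in this thread
import torchvision.models as models

# Illustrative model; the original code is not shown in the thread.
model = models.vgg16(pretrained=True).features
model.eval()

# The input image is the thing being optimized in Deep Dream.
img = Variable(torch.randn(1, 3, 224, 224), requires_grad=True)

for step in range(10):
    out = model(img)
    # Passing out.data (a plain Tensor) as the gradient argument avoids the
    # per-iteration memory growth seen when passing the Variable out itself.
    out.backward(out.data)
    # Gradient ascent on the input image.
    img.data += 1e-2 * img.grad.data
    img.grad.data.zero_()
```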

PyTorch builds graphs when doing operations on Variables; it was probably building a strange, big graph! Good luck, you got me, I was nervous about your code hehe. Sweet conv dreams!

http://pytorch.org/docs/master/autograd.html?highlight=backward#torch.autograd.backward

create_graph (bool, optional) – If true, graph of the derivative will be constructed, allowing to compute higher order derivative products. Defaults to False, unless grad_variables contains at least one non-volatile Variable.

It will create a backward graph because out is a non-volatile Variable.
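
In other words, under that 0.x-era rule, passing out as the gradient argument behaved like setting create_graph=True on every call. A small sketch of the distinction (variable names are illustrative):

```python
import torch
from torch.autograd import Variable

x = Variable(torch.randn(3), requires_grad=True)
out = x * 2

# out is a non-volatile Variable, so using it as the gradient argument
# implicitly behaves like out.backward(out, create_graph=True): a graph
# of the derivative is built and retained, growing memory each iteration.
out.backward(out)

# Passing a plain Tensor keeps create_graph at False, so the backward
# graph is freed as usual after the call:
# out.backward(out.data)
```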

Thanks @ruotianluo, now I understand why memory was increasing with each iteration.