How to free the graph after create_graph=True

Hi,

I use the autograd.grad function with create_graph=True. After I finish, I want to release the GPU memory used by the created backward graph. How can I do that?


Easy, just check out the official docs:

  • retain_graph (bool, optional) – If False, the graph used to compute the grad will be freed. Note that in nearly all cases setting this option to True is not needed and often can be worked around in a much more efficient way. Defaults to the value of create_graph.
  • create_graph (bool, optional) – If True, graph of the derivative will be constructed, allowing to compute higher order derivative products. Defaults to False.

So setting retain_graph=False looks promising.
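A minimal sketch of the default interplay the docs describe (toy values chosen for illustration): because retain_graph defaults to the value of create_graph, a grad call with create_graph=True keeps the graph alive, so a later call through the same graph still works.

```python
import torch

x = torch.tensor(3.0, requires_grad=True)
y = x ** 2

# retain_graph defaults to the value of create_graph, so this call
# keeps the original graph alive...
(g,) = torch.autograd.grad(y, x, create_graph=True)

# ...which is why a second call through the same graph still works.
# This call uses the defaults (retain_graph=False), so it frees the
# graph afterwards.
(g_again,) = torch.autograd.grad(y, x)
print(g.item(), g_again.item())  # 6.0 6.0
```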

Do you mean to call the autograd.grad function again, this time with retain_graph=False? That seems nasty!


If you check the docs, when you set create_graph=True, retain_graph also becomes True, since it defaults to the value of create_graph.

So if you use create_graph=True, you should also explicitly set retain_graph=False to release the memory.

Hi,

I don’t think this is what he wants.
retain_graph can be used if you want to call backward() multiple times on the same graph.
create_graph is used if you want to backward() through the backward(). But since this second backward() will go through the current graph, retain_graph has to be set as well; if you set it to False, you will get an error.
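A minimal sketch of that double backward (toy function, values chosen for illustration): the first grad call builds a graph for the gradient itself, which a second grad call can then differentiate.

```python
import torch

x = torch.tensor(2.0, requires_grad=True)
y = x ** 3

# First derivative: dy/dx = 3x^2 = 12 at x=2. create_graph=True records
# the backward pass itself, so the result g has its own graph.
(g,) = torch.autograd.grad(y, x, create_graph=True)

# Second derivative: d2y/dx2 = 6x = 12 at x=2, obtained by
# backward()-ing through the backward.
(g2,) = torch.autograd.grad(g, x)
print(g.item(), g2.item())  # 12.0 12.0
```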

@kolorado the graph is deleted as soon as all references to it are gone. This would be the Tensor that contains the loss for example.
Note that PyTorch uses a custom GPU memory allocator. When objects are freed, the memory is not returned to the OS directly, so nvidia-smi won't show that objects have been freed. This memory is still available to be reused by PyTorch, so the next forward will not actually allocate more memory on your GPU. You can find the functions here in the doc that let you see exactly how much memory is used and how much is cached to be re-used later.
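A rough sketch of checking this yourself (sizes are arbitrary; this only runs if a CUDA device is available): after deleting every reference that keeps the backward graph alive, torch.cuda.memory_allocated drops, while torch.cuda.memory_reserved (the allocator's cache) typically does not.

```python
import torch

def report():
    # memory_allocated: bytes currently occupied by live tensors
    # memory_reserved: bytes held by PyTorch's caching allocator
    #                  (not returned to the OS, still shown by nvidia-smi)
    return torch.cuda.memory_allocated(), torch.cuda.memory_reserved()

if torch.cuda.is_available():
    x = torch.randn(1024, 1024, device="cuda", requires_grad=True)
    loss = (x ** 2).sum()
    (g,) = torch.autograd.grad(loss, x, create_graph=True)
    alloc_before, _ = report()

    # Drop every reference that keeps the backward graph alive.
    del loss, g
    alloc_after, reserved = report()

    # allocated shrinks; reserved (the cache) usually stays put.
    print(alloc_before, alloc_after, reserved)
```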


True, but if he doesn't plan to call backward() multiple times on the same graph, which I think is the most likely case, it would be great to do:

backward(create_graph=True, retain_graph=False, ...)

to free the graph.

As per this post.

Hi,

But if you create the graph, you're planning on calling backward on it, no?
Basically the existing graph will be a subset of the new graph created by create_graph=True. So if you ever want to use this newly created graph, you must pass retain_graph=True when you create it.
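A small sketch of the subset relationship (toy gradient-penalty style loss, values for illustration): the final loss reaches back through both the gradient's new graph and the original graph, so the original one must still be retained when it is built.

```python
import torch

x = torch.tensor(2.0, requires_grad=True)
y = x ** 3

# create_graph=True: retain_graph defaults to True here, because the
# gradient's new graph shares history with the original one.
(g,) = torch.autograd.grad(y, x, create_graph=True)

# A loss that mixes the new graph (through g) with the original graph
# (through y). This backward traverses both, then frees everything,
# since retain_graph defaults to False on this final call.
loss = y + g ** 2
loss.backward()

# d/dx (x^3 + 9x^4) = 3x^2 + 36x^3 = 12 + 288 = 300 at x=2
print(x.grad.item())  # 300.0
```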

Or am I misunderstanding your point?
