For some reasons, I use retain_graph=True and a hook to get the gradient during backward, but this leads to a GPU memory leak because the computation graph is not released. How can I free the graph manually?
If I’m not mistaken, the graph should be cleared once all tensors attached to it are deleted.
Since Python uses function scoping, you could wrap your calls in a function; the computation graph should then be freed once you leave the function scope (provided you don’t return the output tensor and keep it alive, of course).
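A minimal sketch of this pattern (the tensor names and the toy loss are just placeholders, not your actual model): the hook captures the gradient of an intermediate tensor, the hook handle is removed afterwards, and only a detached copy of the gradient escapes the function, so the graph can be freed when the function returns.

```python
import torch

def train_step(x, w):
    grads = {}
    mid = x * w  # intermediate (non-leaf) tensor whose gradient we want
    # The hook receives the gradient w.r.t. `mid` during backward.
    handle = mid.register_hook(lambda g: grads.setdefault("mid", g))
    loss = mid.sum()
    loss.backward(retain_graph=True)  # graph kept alive for this call
    handle.remove()  # drop the hook so it no longer holds references
    # Return a detached copy only; `mid` and `loss` die with the scope,
    # so the retained graph can be garbage-collected after we return.
    return grads["mid"].detach().clone()

x = torch.ones(3, requires_grad=True)
w = torch.full((3,), 2.0, requires_grad=True)
g = train_step(x, w)  # gradient of sum(x*w) w.r.t. the intermediate x*w
```

The key point is that nothing graph-attached (the output tensor, the raw hook result, or the hook handle) survives the function scope, so `retain_graph=True` no longer pins GPU memory across iterations.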
Yes, some intermediate variables were not released while using the hook; after deleting all of them, there is no memory leak.