When is retain_graph=False and create_graph=True useful?

I thought such a combination should throw an error. I can’t imagine why anyone would want to compute higher derivatives but then forget the graph. Wouldn’t that screw up the computation of higher derivatives by freeing the graph too early?

Why doesn’t that throw an error?

https://pytorch.org/docs/stable/autograd.html

Docs:

  • retain_graph (bool, optional) – If False, the graph used to compute the grad will be freed. Note that in nearly all cases setting this option to True is not needed and often can be worked around in a much more efficient way. Defaults to the value of create_graph.
  • create_graph (bool, optional) – If True, graph of the derivative will be constructed, allowing to compute higher order derivative products. Defaults to False.
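
For context, here is the usual higher-order pattern I have in mind, where retain_graph is simply left at its default (which follows create_graph) — just a minimal sketch:

```python
import torch

x = torch.tensor(2.0, requires_grad=True)
y = x ** 3

# create_graph=True records the backward pass itself, so dy_dx has a grad_fn.
# retain_graph is left at its default, which follows create_graph (True here),
# so the graph of y is kept as well.
(dy_dx,) = torch.autograd.grad(y, x, create_graph=True)

# Differentiate the gradient again to get the second derivative.
(d2y_dx2,) = torch.autograd.grad(dy_dx, x)

print(dy_dx.item(), d2y_dx2.item())  # 12.0 (3x^2 at x=2), 12.0 (6x at x=2)
```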

Hi,

I don’t think there is any practical reason except debugging, where you want to inspect the graph but don’t need to actually run backward on it.

Yeah, that combination makes it impossible to compute even higher-order info, since the graph is freed up… right? If we kept calling torch.autograd.grad for some reason, like in the snippet below.
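
Something like this is what I mean (a minimal sketch):

```python
import torch

x = torch.tensor(2.0, requires_grad=True)
y = x ** 3

# First call succeeds: dy_dx is built with a graph of its own,
# but the buffers of y's graph are freed because retain_graph=False.
(dy_dx,) = torch.autograd.grad(y, x, create_graph=True, retain_graph=False)

# A second backward through y's (now freed) graph raises the familiar
# RuntimeError about trying to backward through the graph a second time.
(dy_dx_again,) = torch.autograd.grad(y, x)
```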

what is a debugging example?

The one I have in mind is if you use something like torchviz, which needs the graph to exist but doesn’t need to backward through it.
But yes it is a weird use case. :smiley:
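
For illustration, a rough sketch of that torchviz use case (assuming torchviz is installed; make_dot only walks the grad_fn chain to draw the graph, it never runs backward through it):

```python
import torch
from torchviz import make_dot  # pip install torchviz

x = torch.tensor(2.0, requires_grad=True)
y = x ** 3

# create_graph=True gives dy_dx a grad_fn, so its computation can be drawn.
# retain_graph=False is fine here because we never backward through it.
(dy_dx,) = torch.autograd.grad(y, x, create_graph=True, retain_graph=False)

# make_dot just inspects grad_fn / next_functions to render the graph.
make_dot(dy_dx).render("grad_graph", format="png")
```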