When do we use retain_graph in torch.autograd.backward?

retain_graph (bool, optional) – If False, the graph used to compute the grad will be freed. Note that in nearly all cases setting this option to True is not needed and often can be worked around in a much more efficient way. Defaults to the value of create_graph.

I know that we don’t really need it in most cases, but I still want to know: in what cases would we actually need retain_graph to be True? A minimal example is below.
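
For illustration, here is a minimal sketch (a toy tensor, not from any real project) of the one situation I can think of: calling backward() a second time through the same graph errors out unless the first call passes retain_graph=True. Are cases like this the intended use, or are there others?

```python
import torch

x = torch.ones(2, 2, requires_grad=True)
y = (x * x).sum()

# First backward pass: keep the graph's saved tensors alive
# so it can be traversed again.
y.backward(retain_graph=True)

# Second backward pass through the same graph; without retain_graph=True
# above, this raises "Trying to backward through the graph a second time".
y.backward()

# Gradients from both passes are accumulated into x.grad.
print(x.grad)
```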