What are the parameters involved to calculate the gradient during backpropagation?

Refer to this.

No need to use retain_graph=True if that runs error-free for you.
However, I anticipate it won't. retain_graph=True will be required here because when you call lossG.backward() without it, you are trying to backward through the graph a second time after the saved tensors have already been freed, which raises an error.
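Here is a minimal sketch of the situation (the names netG/netD are placeholders, not the original poster's code): two losses share one forward pass, so the first backward() call must keep the saved tensors alive for the second one.

```python
import torch
import torch.nn as nn

# Placeholder generator and discriminator (assumed for illustration)
netG = nn.Linear(4, 4)
netD = nn.Linear(4, 1)

noise = torch.randn(2, 4)
fake = netG(noise)   # one forward pass, shared by both losses
out = netD(fake)

lossD = out.mean()
lossG = -out.mean()

lossD.backward(retain_graph=True)  # keep the saved tensors for the next pass
lossG.backward()                   # works; without retain_graph above, this
                                   # raises "Trying to backward through the
                                   # graph a second time ..."
```

If the two losses were instead computed from separate forward passes (or fake were detached for the discriminator step), each backward() would have its own graph and retain_graph=True would be unnecessary.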
