Saving gradients after loss.backward()?

I’m attempting to save the gradients of some parameters with respect to my loss function. I want to take the values of those gradients and use them in another parameter of my network. However, the gradients do not appear to be retained, even with the retain_graph flag set to True. Is there a straightforward way to do this?

Hi,

retain_graph has nothing to do with keeping gradients around; it only keeps the autograd graph alive so that backward() can be called more than once.
Could you give a small code sample that illustrates what you want to do, please?
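
In the meantime: after loss.backward(), the gradient of every leaf parameter is stored in its .grad attribute, and you can copy those values out yourself. A minimal sketch, with a placeholder model and loss:

```python
import torch
import torch.nn as nn

# Toy model and data, purely for illustration.
model = nn.Linear(4, 1)
criterion = nn.MSELoss()

x = torch.randn(8, 4)
target = torch.randn(8, 1)

loss = criterion(model(x), target)
loss.backward()

# Each leaf parameter's gradient now lives in param.grad.
# Clone it so the saved values survive the next zero_grad()
# or backward() call, which would otherwise overwrite them.
saved_grads = {name: p.grad.detach().clone()
               for name, p in model.named_parameters()}

print(saved_grads["weight"])  # a (1, 4) tensor of gradient values
```

If the tensor you care about is not a leaf (e.g. an intermediate activation), you would need tensor.retain_grad() or a hook instead; retain_graph won’t help there either.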