After calling loss.backward() on a network, I want to compute the gradient of a second loss for one specific layer. The simplest way I can see is to call x2 = x.detach(), then run the forward and backward passes again.
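A minimal sketch of that detach idea, assuming a toy two-layer setup (the names front, d, and x are made up for illustration, not from my real code):

```python
import torch
import torch.nn as nn

front = nn.Conv2d(3, 8, 3, padding=1)
d = nn.Conv2d(8, 8, 3, padding=1)   # the specific layer of interest

inp = torch.randn(1, 3, 16, 16)
x = front(inp)                       # activation feeding layer d
loss = d(x).mean()
loss.backward()                      # first loss: grads for front and d

x2 = x.detach()                      # cut the graph at d's input
loss2 = d(x2).abs().mean()           # second forward through d only
loss2.backward()                     # grads flow into d's parameters only
```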
If I only use retain_graph=True, the gradients of every parameter in the graph will be computed. However, I only want to compute the gradient of one specific layer (e.g., Conv layer d in the example code).
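One way I could imagine doing this (a sketch with the same made-up layer names, not tested against my real code) is torch.autograd.grad, which computes gradients only for the tensors you pass as inputs rather than populating .grad everywhere:

```python
import torch
import torch.nn as nn

front = nn.Conv2d(3, 8, 3, padding=1)
d = nn.Conv2d(8, 8, 3, padding=1)   # the layer I actually care about

out = d(front(torch.randn(1, 3, 16, 16)))
loss = out.mean()
loss.backward(retain_graph=True)     # keep the graph for a second use

# torch.autograd.grad returns grads only for the tensors in `inputs`
# (here d's weight and bias); it does not touch .grad elsewhere.
loss2 = out.abs().mean()
grads_d = torch.autograd.grad(loss2, list(d.parameters()))
print([g.shape for g in grads_d])
```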
In my view, for a given forward pass you can do multiple backward passes by using retain_graph=True. If that is your scenario, it should work, I think.
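A tiny sketch of that pattern, with made-up values:

```python
import torch

w = torch.randn(3, requires_grad=True)
y = (w * 2.0).sum()            # single forward pass

y.backward(retain_graph=True)  # first backward, graph is retained
print(w.grad)                  # tensor([2., 2., 2.])

y.backward()                   # second backward on the same graph
print(w.grad)                  # grads accumulate: tensor([4., 4., 4.])
```

Note that .grad accumulates across the two calls, so you may want to zero or copy out the gradients between the backward passes.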