How to calculate the gradients of an intermediate layer w.r.t. another intermediate layer?

Hi, everyone! I have a problem with calculating the gradients of an intermediate layer. In PyTorch, backward() and register_hook() only give me the gradients of the target layers w.r.t. the final output or loss. But by the chain rule, these gradients equal (gradients of the target layers w.r.t. an intermediate layer) * (gradients of that intermediate layer w.r.t. the final loss). How can I calculate the gradients of the target layers w.r.t. another intermediate layer?
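To make the question concrete, here is a toy sketch of what I can already get with register_hook() (the block names and sizes are invented for illustration):

```python
import torch
import torch.nn as nn

# Toy two-block model, just to make the setup explicit
block1 = nn.Linear(10, 20)
block2 = nn.Linear(20, 1)

x = torch.randn(4, 10)
h1 = block1(x)                                   # target intermediate layer output
grads = {}
h1.register_hook(lambda g: grads.update(h1=g))   # captures dL/dh1 during backward()
loss = block2(h1).sum()
loss.backward()

# grads['h1'] is the gradient of the final loss w.r.t. h1.
# What I actually want is the gradient of a *later intermediate* output
# (e.g. block2's pre-reduction output) w.r.t. h1, not w.r.t. the final loss.
```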

If I understand you correctly, you want to calculate the gradients of a particular layer, or up to a particular layer.

While I do not understand the reason or benefit behind it, you can probably do that by calculating the loss directly from that particular layer, or by making a subset of layers over which to calculate the gradient.
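A rough sketch of one way to read that suggestion, assuming the model can be split into sub-blocks (all names, sizes, and the proxy loss are invented for illustration):

```python
import torch
import torch.nn as nn

# Split the model so the "subset of layers" is explicit
block1 = nn.Sequential(nn.Linear(10, 20), nn.ReLU())  # up to the target layer
block2 = nn.Linear(20, 30)                            # the later intermediate layer

x = torch.randn(4, 10)
h1 = block1(x)
h2 = block2(h1)

# Treat a scalar built from the intermediate output as the "loss", then
# differentiate it w.r.t. the earlier activation or block1's parameters.
proxy_loss = h2.sum()
grad_h1 = torch.autograd.grad(proxy_loss, h1, retain_graph=True)[0]
grads_w1 = torch.autograd.grad(proxy_loss, list(block1.parameters()))
```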

Call targetlayer.backward(retain_graph=True). Before the call and after you get the gradient, both times you have to zero the gradients.
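A minimal sketch of that recipe (module names invented; note that backward() is called on the target layer's output tensor, which is not a scalar, so it needs an explicit gradient argument):

```python
import torch
import torch.nn as nn

layer1 = nn.Linear(10, 20)   # the target layer (for illustration only)
layer2 = nn.Linear(20, 1)

x = torch.randn(4, 10)
target_out = layer1(x)
final_out = layer2(target_out)

layer1.zero_grad()           # zero gradients before the backward call

# Backprop starting from the target layer's output; pass an explicit
# gradient (here all ones) and keep the graph for any later backward pass.
target_out.backward(torch.ones_like(target_out), retain_graph=True)

grad = layer1.weight.grad.clone()   # gradient of target_out.sum() w.r.t. layer1's weight

layer1.zero_grad()           # zero again so a later loss.backward() starts clean
```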