Prevent a layer which requires grad from being impacted by a given loss

Consider the following sequential model:
Net A -> Net B -> Net C

Also consider Loss B, computed with the output of Net B, and Loss C, with the output of Net C.

I would like Loss C to impact the weights of Net A, B and C, and Loss B to only impact Net B.

If I add the two losses and run loss.backward(), Loss B will impact Net A. If I set .requires_grad = False for Net A, then it won't be impacted by Loss C.

Could anyone help me achieve the behavior I want?

You could run two forward passes through NetB: one with the output of NetA and one with the detached output of NetA:

outA = NetA(input)
outB = NetB(outA)               # this graph reaches back through NetA
outB_det = NetB(outA.detach())  # this graph stops at NetB
outC = NetC(outB)

loss = criterion(outC, target)
loss.backward()    # accumulates grads in NetA, NetB, and NetC

lossB = criterion(outB_det, target)
lossB.backward()   # accumulates grads in NetB only
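If it helps, here is a minimal self-contained sketch of the same idea. The three nn.Linear modules, the MSE criterion, the optimizer, and the separate targetB for the intermediate loss are placeholder assumptions, not from your code; it also checks that lossB leaves NetA's gradients untouched:

import torch
import torch.nn as nn

# Placeholder modules for illustration; swap in your real NetA / NetB / NetC.
NetA = nn.Linear(10, 10)
NetB = nn.Linear(10, 10)
NetC = nn.Linear(10, 1)
criterion = nn.MSELoss()
optimizer = torch.optim.SGD(
    list(NetA.parameters()) + list(NetB.parameters()) + list(NetC.parameters()),
    lr=0.1)

x = torch.randn(4, 10)
target = torch.randn(4, 1)
targetB = torch.randn(4, 10)  # assumed target for the intermediate loss

optimizer.zero_grad()

outA = NetA(x)
outB = NetB(outA)               # graph reaches back to NetA
outB_det = NetB(outA.detach())  # graph stops at NetB
outC = NetC(outB)

loss = criterion(outC, target)
loss.backward()                 # grads for NetA, NetB, NetC

gradA_before = NetA.weight.grad.clone()

lossB = criterion(outB_det, targetB)
lossB.backward()                # adds grads to NetB only

# True: lossB did not change NetA's gradients
print(torch.equal(gradA_before, NetA.weight.grad))

optimizer.step()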

Let me know if this would work for you.