Will gradients flow through a layer when its requires_grad is set to False?

Suppose I have a network G that outputs a map out_G, and I then feed out_G to a network D, which outputs out_D. If I set requires_grad to False for all of D's parameters and compute the loss from out_D, I know the parameters of network D will not be updated, but will the parameters of network G be updated?


Yes.
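
Setting requires_grad = False on D's parameters only keeps D itself from accumulating gradients and being updated; the backward pass still propagates through D's operations back to G. Here is a minimal runnable sketch to verify this, using the names from the question (the toy Linear modules and sizes are assumptions, not from the original setup):

import torch
import torch.nn as nn

# Toy stand-ins for G and D (the Linear sizes here are assumptions).
G = nn.Linear(10, 10)
D = nn.Linear(10, 1)

# Freeze D: its parameters will not accumulate gradients or be updated.
for p in D.parameters():
    p.requires_grad = False

x = torch.randn(4, 10)
out_G = G(x)       # out_G requires grad because G's parameters do
out_D = D(out_G)   # gradients still flow *through* D's frozen layers
loss = out_D.mean()
loss.backward()

print(G.weight.grad is None)  # False -> G received gradients
print(D.weight.grad is None)  # True  -> D stays frozen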

Best regards

Thomas