Confusion regarding requires_grad for ensuring that particular networks are not updated?

I have four networks in a GAN-like setup: F, C, G, and D. There is one optimizer for F and C together, one optimizer for G, and one optimizer for D.

I need the D optimizer to update only the discriminator (D), the G optimizer to update only the generator (G), and the remaining optimizer to update only F and C.
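Roughly, the setup looks like this (a simplified sketch; the real architectures are more complex and the learning rates here are just placeholders):

```python
import torch
import torch.nn as nn
import torch.optim as optim

# Placeholder modules -- the real F, C, G and D are more complex.
F_net = nn.Linear(128, 64)   # feature extractor F
C_net = nn.Linear(64, 1)     # regression head C
G_net = nn.Linear(64, 128)   # generator / reconstructor G
D_net = nn.Linear(128, 1)    # discriminator D

# One optimizer for F and C together, one for G, one for D.
opt_fc = optim.Adam(list(F_net.parameters()) + list(C_net.parameters()), lr=1e-4)
opt_g = optim.Adam(G_net.parameters(), lr=1e-4)
opt_d = optim.Adam(D_net.parameters(), lr=1e-4)
```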

The images go through F, which produces an output.

I detach that output and pass it through G, which should reconstruct the images. The output of G is then detached and passed through D.

The same output of F also passes through C for a regression task, which has its own regression loss.
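Concretely, the forward pass per iteration looks roughly like this (continuing the sketch above; images and targets stand for a hypothetical batch from my dataloader):

```python
features = F_net(images)            # output of F

# Reconstruction branch: the output of F is detached before it goes into G.
recon = G_net(features.detach())    # G should reconstruct the images
d_fake = D_net(recon.detach())      # the output of G is detached before D
d_real = D_net(images)

# Regression branch: the same output of F goes through C.
pred = C_net(features)
```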

I first calculate the adversarial loss for the discriminator and call lossd.backward().
Then I compute the adversarial loss for the generator.
Finally, I calculate the regression loss and call regressionloss.backward().
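Continuing the sketch, the losses are computed and backpropagated in this order (zero_grad/step calls and the requires_grad toggling described below are omitted; BCE and MSE are just example criteria):

```python
bce = nn.BCEWithLogitsLoss()
mse = nn.MSELoss()

# 1) Adversarial loss for the discriminator (uses the detached output of G).
lossd = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
lossd.backward()

# 2) Adversarial loss for the generator: D applied to the non-detached
#    reconstruction, so gradients can flow back into G.
lossg = bce(D_net(recon), torch.ones_like(d_fake))
lossg.backward()

# 3) Regression loss on the non-detached output of F.
regressionloss = mse(pred, targets)
regressionloss.backward()
```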

At the beginning of the iteration, I set requires_grad=True for all networks. When I backpropagate a particular network, I set requires_grad=True for that network and False for all the others.
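The toggling I mean, wrapped around each backward() call above, is roughly this (a sketch of my current approach, not necessarily the right one):

```python
def set_requires_grad(net, flag):
    # Toggle requires_grad for every parameter of a network.
    for p in net.parameters():
        p.requires_grad_(flag)

all_nets = [F_net, C_net, G_net, D_net]

# At the beginning of the iteration: requires_grad=True for everything.
for net in all_nets:
    set_requires_grad(net, True)

# Before each backward, enable only the network(s) that loss should update,
# e.g. before lossd.backward():
for net in all_nets:
    set_requires_grad(net, net is D_net)
# ...and analogously only G before lossg.backward(), and only F and C before
# regressionloss.backward().
```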

Would this be logically correct? Also, is there a better way to do this, or am I being too redundant?