Different losses for different parts of the graph

Hi guys,

I have an encoder (E) and a generator (G). One of the layers (Z) in between is stochastic.
The two modules have different losses. How can I update their weights? Do you think retain_graph=True is relevant here?

Do you think the following approach will work:

  1. Have two optimizers, one each for E and G.
  2. Compute Z and Loss_G. Call Loss_G.backward() to populate G.grad. At this point I do not update G's parameters yet.
  3. Set E.grad to zero using E_optimizer.zero_grad(), then call E_loss.backward() to get the correct E.grad.
  4. Update both E's and G's weights simultaneously (rough sketch of this below).
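
Roughly what I have in mind (the modules, losses, and data below are just placeholders for my actual setup):

```python
import torch

# Placeholder modules -- the real E and G are of course more complex.
E = torch.nn.Linear(10, 5)
G = torch.nn.Linear(5, 10)

E_optimizer = torch.optim.Adam(E.parameters(), lr=1e-3)
G_optimizer = torch.optim.Adam(G.parameters(), lr=1e-3)

x = torch.randn(4, 10)

# Forward pass: E -> (stochastic) Z -> G
z_mean = E(x)
Z = z_mean + torch.randn_like(z_mean)   # stand-in for the stochastic layer
out = G(Z)

Loss_G = out.pow(2).mean()              # placeholder generator loss
E_loss = z_mean.pow(2).mean()           # placeholder encoder loss

# Step 2: populate G.grad. This also fills E.grad through Z,
# which is why step 3 zeroes it again. retain_graph=True because
# E's part of the graph is needed again for E_loss.backward().
E_optimizer.zero_grad()
G_optimizer.zero_grad()
Loss_G.backward(retain_graph=True)

# Step 3: throw away E's gradients from Loss_G, recompute them from E_loss
E_optimizer.zero_grad()
E_loss.backward()

# Step 4: update both simultaneously
E_optimizer.step()
G_optimizer.step()
```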

In step 3, how can I ensure that G.grad is not touched while E_loss.backward() runs? Can we exclude nodes like that during a particular backward pass?

Thanks

You can call .detach() on some variables to ensure G.grad is not touched when E_loss.backward() is called.
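
A minimal sketch of the mechanism (toy layers, not your actual E and G): gradients stop flowing at a detached tensor, so the module upstream of the detach never gets its .grad written.

```python
import torch

E = torch.nn.Linear(4, 4)   # toy stand-ins, just to show the mechanism
G = torch.nn.Linear(4, 4)

Z = E(torch.randn(2, 4))

# Backward through G only: the graph is cut at Z.detach(),
# so E's parameters never receive a gradient from this loss.
G(Z.detach()).sum().backward()

print([p.grad for p in E.parameters()])              # [None, None] -- untouched
print([p.grad is not None for p in G.parameters()])  # [True, True]
```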

See the GAN example for instance:
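
The relevant pattern there looks roughly like this (a condensed sketch with toy stand-in modules, not the example's actual code):

```python
import torch
import torch.nn as nn

# Toy stand-ins for the generator and discriminator.
netG = nn.Linear(8, 16)
netD = nn.Sequential(nn.Linear(16, 1), nn.Sigmoid())
optimizerG = torch.optim.Adam(netG.parameters(), lr=2e-4)
optimizerD = torch.optim.Adam(netD.parameters(), lr=2e-4)
criterion = nn.BCELoss()

noise = torch.randn(4, 8)
fake = netG(noise)

# Discriminator step: detach the fake samples so this backward pass
# never reaches netG -- netG's .grad attributes stay untouched.
optimizerD.zero_grad()
errD_fake = criterion(netD(fake.detach()), torch.zeros(4, 1))
errD_fake.backward()
optimizerD.step()

# Generator step: no detach here, so gradients flow back into netG.
optimizerG.zero_grad()
errG = criterion(netD(fake), torch.ones(4, 1))
errG.backward()
optimizerG.step()
```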