Backpropagate loss for two separate neural networks in one function

I have defined two separate networks:
def common(Loss1, Loss2):
    # Loss1 and Loss2 are computed elsewhere from the networks' outputs.
    # Note: creating the optimizers here rebuilds Adam's state on every call;
    # normally they would be created once, outside this function.
    Optim1 = optim.Adam(network1.parameters())
    Optim2 = optim.Adam(network2.parameters())
    Optim1.zero_grad()
    Optim2.zero_grad()
    Loss1.backward()
    Loss2.backward()
    Optim1.step()
    Optim2.step()
To run the above function I need to set retain_graph=True…

So, when I call Loss1.backward(), is Loss1 backpropagated through both neural networks or only through the first one?
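
For reference, a minimal self-contained sketch of that setup (the network architectures, loss functions, and dummy data below are my own assumptions, and here each loss is built only from its own network's output):

import torch
import torch.nn as nn
import torch.optim as optim

# Assumed toy networks, just to make the snippet runnable.
network1 = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
network2 = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))

Optim1 = optim.Adam(network1.parameters(), lr=1e-3)
Optim2 = optim.Adam(network2.parameters(), lr=1e-3)

def common(x1, x2, y1, y2):
    # Loss1 depends only on network1's output, Loss2 only on network2's.
    Loss1 = nn.functional.mse_loss(network1(x1), y1)
    Loss2 = nn.functional.mse_loss(network2(x2), y2)
    Optim1.zero_grad()
    Optim2.zero_grad()
    Loss1.backward()   # fills .grad only for tensors in Loss1's graph
    Loss2.backward()   # fills .grad only for tensors in Loss2's graph
    Optim1.step()
    Optim2.step()

common(torch.randn(4, 10), torch.randn(4, 10),
       torch.randn(4, 1), torch.randn(4, 1))

With fully independent graphs like this, no retain_graph flag is needed.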

You only need to do it for Loss1.backward(), not for Loss2.backward(). If you have another Loss3 that you wish to backpropagate after Loss2, you need to set retain_graph=True for Loss2.backward() as well.

Thanks, but that doesn’t answer my question…please read the last line. My question is not about the use of retain_graph=True, it is about backpropagating the loss: is it backpropagated through both neural networks or only through one?

Only the first one.

Note that you shouldn’t use the same tensor in both NNs (a shared input or a shared weight) unless you know what you’re doing.
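
A quick way to verify this (a small sketch with assumed toy layers) is to check which .grad fields get populated after a single backward call:

import torch
import torch.nn as nn

# Assumed toy networks; only used to inspect which gradients get filled.
net1 = nn.Linear(5, 1)
net2 = nn.Linear(5, 1)

x = torch.randn(3, 5)
loss1 = net1(x).sum()   # the graph of loss1 contains only net1's parameters
loss1.backward()

print(net1.weight.grad is None)  # False: net1 received gradients
print(net2.weight.grad is None)  # True:  net2 was not touched

Gradients only reach parameters that appear in the graph of the loss you call .backward() on.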

Thanks, I got it…I have defined the two neural networks in separate classes, so I thought I wouldn’t need to keep retain_graph=True since they are different network classes. Loss1 and Loss2 are calculated using the outputs of both neural networks, and I want to backpropagate Loss1 only into the first network and Loss2 only into the second. Is what I did in the question OK? Without retain_graph=True it throws the error
"

"

So I was forced to keep retain_graph=True…It should work without it, right?
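
For what it’s worth, here is a small sketch of why the error can appear (the toy layers and loss formulas are my own assumptions): if both losses are built from both networks' outputs, they share one computation graph, so the first backward frees buffers the second still needs. Detaching the other network's output is one possible way to keep the two backward passes independent, if that matches your intent of training each network with only its own loss:

import torch
import torch.nn as nn

# Assumed toy setup: both losses are built from BOTH networks' outputs,
# which is what forces retain_graph=True.
net1 = nn.Linear(5, 1)
net2 = nn.Linear(5, 1)
x = torch.randn(3, 5)

out1 = net1(x)
out2 = net2(x)

# Variant A: the losses share the graph of out1 and out2.
lossA1 = (out1 - out2).pow(2).mean()
lossA2 = (out2 - out1).abs().mean()
lossA1.backward(retain_graph=True)  # without retain_graph=True, the next call
lossA2.backward()                   # raises "Trying to backward through the graph
                                    # a second time ..."
net1.zero_grad()
net2.zero_grad()

# Variant B (assumed alternative): detach the other network's output so each
# loss's graph contains only one network; retain_graph is then not needed and
# each loss updates only its own network.
out1 = net1(x)
out2 = net2(x)
lossB1 = (out1 - out2.detach()).pow(2).mean()
lossB2 = (out2 - out1.detach()).abs().mean()
lossB1.backward()   # gradients flow only into net1
lossB2.backward()   # gradients flow only into net2

Note that in Variant A, Loss1.backward() also deposits gradients into the second network (and Loss2 into the first), which may not be what you want if each loss is meant to train only one network.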