How to get the gradient of one cost function out of two?

What I do is:

totalLoss = g_cost + G_mseCost
totalLoss.backward()

and then I take the gradient of the total loss:

for name, f in G.named_parameters():
    print(name, f.grad)  # f.grad holds the gradient of totalLoss w.r.t. this parameter

before I apply the update:
g_optimizer.step()

In this case the gradient I get is the combined gradient. Is there a way to get the gradient of just one of the two losses without doing two backward passes? And if two backward passes are the only way, how can I do them without running the forward pass again?

G_mseCost.backward(retain_graph=True)

@chenyuntc So you are suggesting to use two backward passes?

To do two backward passes you would need to pass retain_graph=True to the first one.
e.g.

G_mseCost.backward(retain_graph=True)
g_cost.backward()

I can’t see any way to do it with only one backward pass.
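
If you also want the gradient of G_mseCost on its own, here is a minimal sketch (reusing G, g_cost, G_mseCost and g_optimizer from the first post) that clones param.grad between the two backward calls, so only one forward pass is needed:

g_optimizer.zero_grad()
G_mseCost.backward(retain_graph=True)   # keep the graph so g_cost can still backprop
mse_grads = {name: p.grad.clone()       # gradient of G_mseCost alone
             for name, p in G.named_parameters() if p.grad is not None}
g_cost.backward()                       # accumulates the g_cost gradient on top
# p.grad now holds the combined gradient; p.grad - mse_grads[name] is g_cost's share
g_optimizer.step()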


In this case should I just call
optimizer.step()
and it will use both gradients in the update, or should I first add them into a new loss?

Each time you do something.backward() the gradients are added to the gradients stored in param.grad for each parameter.

So you can do

loss1.backward(retain_graph=True)
loss2.backward()
optimizer.step() # uses combined gradients from both losses
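
One thing to keep in mind: because the gradients accumulate in param.grad, you need to clear them before the next iteration, otherwise the next step would also include this step's gradients. A sketch of one full step (same names as above):

optimizer.zero_grad()               # clear gradients left over from the previous step
loss1.backward(retain_graph=True)   # adds d(loss1)/d(param) to param.grad
loss2.backward()                    # adds d(loss2)/d(param) on top
optimizer.step()                    # the update uses the summed gradients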