My model has two sets of parameters, `a` and `b`, both of which require gradients. I have two different loss functions:

loss1 = function(a, b)
loss2 = function(b)

and the total loss is

total_loss = loss1 + loss2
Assuming `loss1` calculates `b1.grad` and `loss2` calculates `b2.grad`, the total `b.grad` calculated by `total_loss.backward()` is `b.grad = b1.grad + b2.grad`.
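To make that assumption concrete, here is a minimal toy example (the tensor shapes and loss forms are made up; `function` stands in for my real losses) showing that calling `backward()` on the sum accumulates both contributions into `b.grad`:

```python
import torch

a = torch.randn(4, requires_grad=True)
b = torch.randn(4, requires_grad=True)

loss1 = (a * b).sum()   # toy loss1(a, b); its gradient wrt b is a   -> "b1.grad"
loss2 = (b ** 2).sum()  # toy loss2(b);    its gradient wrt b is 2*b -> "b2.grad"

total_loss = loss1 + loss2
total_loss.backward()

# b.grad holds the accumulated sum of both contributions
print(torch.allclose(b.grad, a + 2 * b))  # True
```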
My goal is to modify the `b1.grad` coming from `loss1` and the `b2.grad` coming from `loss2`, and then add them together as the backward gradient for `b`. Currently, when I call `total_loss.backward()`, it already gives me the accumulated gradient `b.grad = b1.grad + b2.grad`. How can I access and modify each individual `b1.grad` and `b2.grad` so that `total_loss.backward()` returns the sum of the modified `b1.grad` and `b2.grad`?
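For reference, one way I can think of to separate the two contributions is `torch.autograd.grad`, computing each gradient individually before combining them by hand (toy tensors and losses again; the scaling factors are just hypothetical stand-ins for the modification I want to apply). I'm not sure this is the recommended approach:

```python
import torch

a = torch.randn(4, requires_grad=True)
b = torch.randn(4, requires_grad=True)

loss1 = (a * b).sum()   # toy loss1(a, b)
loss2 = (b ** 2).sum()  # toy loss2(b)

# Compute each loss's gradient wrt b separately, without touching b.grad.
# retain_graph=True keeps the graph alive for the second call.
b1_grad, = torch.autograd.grad(loss1, b, retain_graph=True)
b2_grad, = torch.autograd.grad(loss2, b)

# Modify each contribution (placeholder modifications), then accumulate manually.
b.grad = 0.5 * b1_grad + 2.0 * b2_grad
```

Is there a cleaner way to get the same effect from a single `total_loss.backward()` call?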