How to modify the gradient manually?

Thanks.

And what if I have two loss functions and a different operation to apply to each?

For example, I have f_1 and f_2, and I want the learning signal (i.e. the gradient actually applied to the parameters) to be D(f_1) + D(f_2) * C, where D(f) stands for the derivative of f with respect to the parameters.

It should be:

# assumes p.grad is zero at this point (e.g. optimizer.zero_grad() was called before)
loss2.backward()             # p.grad now holds D(f_2); add retain_graph=True if the two losses share the same forward graph
for p in model.parameters():
    p.grad *= C              # scale the f_2 part: p.grad = C * D(f_2)
loss1.backward()             # accumulate D(f_1): p.grad = D(f_1) + C * D(f_2)
optimizer.step()

This works only because I compute the gradients of f_2 before those of f_1; otherwise .grad accumulates both gradients and I can't access the f_2 part on its own. Is there a way to do this regardless of the order? In other words, is there a way to access a gradient before it is accumulated into p.grad?
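
For illustration, here is a rough sketch of the kind of order-independent version I'm after, assuming torch.autograd.grad is the right tool (it returns the gradients as a tuple instead of accumulating them into p.grad). The model, losses, C and optimizer below are just toy stand-ins for the real training loop:

import torch

# toy setup only for illustration; the real model and losses come from my training loop
model = torch.nn.Linear(4, 1)
x = torch.randn(8, 4)
out = model(x)
loss1 = out.pow(2).mean()          # stands in for f_1
loss2 = out.abs().mean()           # stands in for f_2
C = 0.1
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)

params = [p for p in model.parameters() if p.requires_grad]

optimizer.zero_grad()

# gradients of f_2, returned as a tuple instead of being accumulated into p.grad
# (retain_graph=True because both losses share the same forward pass here)
grads2 = torch.autograd.grad(loss2, params, retain_graph=True)

# accumulate D(f_1) into p.grad as usual
loss1.backward()

# combine: p.grad = D(f_1) + C * D(f_2), independent of the order of the two backward steps
with torch.no_grad():
    for p, g2 in zip(params, grads2):
        p.grad += C * g2

optimizer.step()

Alternatively, maybe per-parameter hooks (p.register_hook) could scale the f_2 gradients on the fly before they are accumulated, but I'm not sure whether that would be cleaner than the sketch above.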