Using specific loss for an optimizer

When we have multiple optimizers and multiple losses, is it possible to specify which loss to backward() for a given optimizer? Something like a wrapper:

with opt_1 as optimizer:
    loss.backward()

Losses backpropagate along the computation graph.
The graph is built at the tensor level; it is completely separate from the optimizers, which only update the weights they were given.
So it's not really clear what you are trying to do.
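
To make the distinction concrete, here is a minimal sketch (module names, shapes, and losses are made up) of the usual pattern: each optimizer is constructed over its own subset of parameters, each loss is backpropagated through whatever graph produced it, and stepping an optimizer only touches the parameters it owns.

    import torch
    import torch.nn as nn

    # Two independent sub-modules with made-up shapes, each with its own loss.
    net_a = nn.Linear(10, 2)
    net_b = nn.Linear(10, 2)

    # Each optimizer is constructed over its own parameter subset.
    opt_a = torch.optim.SGD(net_a.parameters(), lr=0.1)
    opt_b = torch.optim.SGD(net_b.parameters(), lr=0.1)

    x = torch.randn(4, 10)
    target = torch.randn(4, 2)

    loss_a = nn.functional.mse_loss(net_a(x), target)
    loss_b = nn.functional.mse_loss(net_b(x), target)

    # backward() follows each loss's own graph; it knows nothing about optimizers.
    loss_a.backward()  # fills .grad only for net_a's parameters
    loss_b.backward()  # fills .grad only for net_b's parameters

    # step() updates only the parameters each optimizer was given.
    opt_a.step()
    opt_b.step()
    opt_a.zero_grad()
    opt_b.zero_grad()

In other words, the pairing of a loss with an optimizer is not declared anywhere; it falls out of which parameters appear in the loss's graph and which parameters the optimizer holds.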

I see. I was considering freezing part of the model and using one loss for that part, and a second loss for another part. But my concern is: after I backward the first loss, since the parameters have already been changed, will that affect the second backward() call?
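
For what it's worth: backward() by itself never changes parameters. It only accumulates gradients into each parameter's .grad field, and the weights move only when optimizer.step() runs. So as long as both backward() calls happen before any step(), the second backward is computed against exactly the same weights as the first. A minimal sketch, assuming a made-up frozen backbone plus a trainable head:

    import torch
    import torch.nn as nn

    # Hypothetical split: a frozen backbone and a trainable head (names made up).
    backbone = nn.Linear(10, 10)
    head = nn.Linear(10, 2)

    # Freezing: no gradients are computed or stored for these parameters.
    for p in backbone.parameters():
        p.requires_grad_(False)

    opt_head = torch.optim.SGD(head.parameters(), lr=0.1)

    x = torch.randn(4, 10)
    target = torch.randn(4, 2)

    out = head(backbone(x))
    loss_1 = nn.functional.mse_loss(out, target)
    loss_2 = out.abs().mean()  # a second, made-up loss on the same output

    w_before = head.weight.clone()

    # backward() only accumulates into .grad; the weights stay untouched,
    # so the second backward sees the same parameter values as the first.
    loss_1.backward(retain_graph=True)  # retain_graph: loss_2 shares this graph
    loss_2.backward()
    assert torch.equal(head.weight, w_before)

    # Parameters change only when step() runs.
    opt_head.step()
    assert not torch.equal(head.weight, w_before)

If you instead call step() between the two backward passes, the weights do change in between, and you would also need to redo the forward pass for the second loss to reflect the updated parameters.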