How to specify the updated parameters for a certain loss

Hi,
I have an application where loss_A should only update, say, the first layer, while loss_B should update the whole network. loss_A and loss_B should be trained together (meaning the first layer is updated by both loss_A and loss_B at the same time). How do I specify which parameters should be updated by a certain loss? Thank you.

I don’t know of a built-in function that updates different layers with different losses, but you can try something like this (compute_loss_A, compute_loss_B, x, and optimizer are placeholders for your own forward pass, losses, and optimizer):

# Note: requires_grad must be set BEFORE the forward pass that produces
# a loss; changing the flags afterwards has no effect on a graph that
# has already been built.
optimizer.zero_grad()

# First, make only the first parameter tensor trainable (index 0 is the
# first layer's weight; a real first layer usually has a bias tensor too)
# and backpropagate loss_A
for i, param in enumerate(model.parameters()):
    param.requires_grad = (i == 0)
loss_A = compute_loss_A(model(x))  # placeholder forward pass and loss
loss_A.backward()

# Then, freeze the first parameter tensor and let loss_B update the
# other layers
for i, param in enumerate(model.parameters()):
    param.requires_grad = (i != 0)
loss_B = compute_loss_B(model(x))  # placeholder forward pass and loss
loss_B.backward()

optimizer.step()  # apply both sets of accumulated gradients
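
Note that the snippet above gives the first layer gradients only from loss_A. If you want the first layer to be updated by both losses in the same step (as described in the question), one alternative is to backpropagate loss_A, discard its gradients everywhere except the first layer, and then backpropagate loss_B over the whole network. Below is a minimal, self-contained sketch; the two-layer Sequential model, the shared criterion, and the random input/target are made-up placeholders:

import torch
import torch.nn as nn

# Made-up model, optimizer, and data purely for illustration
model = nn.Sequential(nn.Linear(10, 10), nn.Linear(10, 2))
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
x = torch.randn(4, 10)
target = torch.randint(0, 2, (4,))

optimizer.zero_grad()
out = model(x)
loss_A = criterion(out, target)  # stand-in for your loss_A
loss_B = criterion(out, target)  # stand-in for your loss_B

# Backprop loss_A, then drop its gradients on everything except the
# first layer, so loss_A only influences that layer
loss_A.backward(retain_graph=True)  # keep the graph alive for loss_B
for name, p in model.named_parameters():
    if not name.startswith("0."):  # "0." = first layer of this Sequential
        p.grad = None

# Backprop loss_B over the whole network; the first layer's gradients
# accumulate on top of those from loss_A
loss_B.backward()
optimizer.step()

Since both backward calls share the same forward graph, retain_graph=True on the first call is needed so the second call can reuse it.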