Loss function backward problem

Suppose I have a multi-task learning problem: I have a shared CNN module with 4 different small networks behind it. The losses for the four networks are L1, L2, L3, L4, so the total loss is L = L1 + L2 + L3 + L4. How should I update the parameters of the 4 different networks? With L.backward()? Or with L1.backward(), L2.backward(), etc. separately?

The correct way is to use:
L.backward()
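A minimal sketch of this setup (the model sizes and MSE losses here are assumptions for illustration, not from the question): a shared trunk is run once, each head produces its own loss, the losses are summed, and a single backward call populates gradients for the trunk and all four heads.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical shared CNN trunk and four small task heads.
shared = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(), nn.Flatten())
heads = nn.ModuleList([nn.Linear(8 * 8 * 8, 1) for _ in range(4)])

x = torch.randn(2, 3, 8, 8)                      # dummy input batch
targets = [torch.randn(2, 1) for _ in range(4)]  # dummy per-task targets

features = shared(x)  # shared forward pass, computed once

# One loss per head; summing them keeps everything in one compute graph.
losses = [nn.functional.mse_loss(h(features), t) for h, t in zip(heads, targets)]
L = sum(losses)

L.backward()  # single backward pass fills grads for the trunk and all heads

print(all(p.grad is not None for p in shared.parameters()))
print(all(p.grad is not None for h in heads for p in h.parameters()))
```

In practice you would then step a single optimizer constructed over the parameters of the shared module and all four heads.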

Yes, I was overcomplicating it. The sum L = L1 + L2 + L3 + L4 is itself part of the compute graph, so backpropagating through L reaches every branch.
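A toy check of that point (an assumption for illustration, not from the thread): because autograd accumulates gradients additively into `.grad`, calling backward on each loss separately produces the same gradients as summing first and calling backward once.

```python
import torch

# One shared parameter used by two "task" losses.
w = torch.tensor([1.0, 2.0], requires_grad=True)

# Approach 1: sum the losses, one backward pass.
l1 = (w ** 2).sum()
l2 = (3 * w).sum()
(l1 + l2).backward()
g_sum = w.grad.clone()

# Approach 2: backward on each loss separately; grads accumulate.
w.grad = None
l1b = (w ** 2).sum()
l2b = (3 * w).sum()
l1b.backward()
l2b.backward()
g_sep = w.grad.clone()

print(torch.allclose(g_sum, g_sep))  # → True: both approaches agree
```

The summed version is preferred in practice because it traverses the shared part of the graph only once, whereas separate backward calls through a shared forward pass would also need `retain_graph=True`.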