How to implement multiple loss functions at different layers

Does torch.autograd.backward([l1, l2]) mean that the two losses backpropagate from separate nodes? For example, the softmax loss updates fc2 and the layers before fc2, while custom_loss updates fc1 and the layers before fc1?
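
To make the question concrete, here is a minimal sketch of the setup I have in mind (Net, fc1, fc2, and the stand-in custom loss are just for illustration, not my real code):

```python
import torch
import torch.nn as nn

# input -> fc1 -> fc2, with one loss attached at each layer
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(10, 8)
        self.fc2 = nn.Linear(8, 4)

    def forward(self, x):
        h = torch.relu(self.fc1(x))  # intermediate activation for the custom loss
        out = self.fc2(h)            # final logits for the softmax loss
        return h, out

net = Net()
x = torch.randn(2, 10)
target = torch.tensor([0, 3])

h, out = net(x)
l1 = h.pow(2).mean()                     # stand-in for custom_loss on fc1's output
l2 = nn.CrossEntropyLoss()(out, target)  # softmax loss on fc2's output

# Backprop both losses in one call. As I understand it, each loss flows
# back through its own graph and the gradients accumulate in .grad:
# l2 reaches fc2 and fc1, while l1 only reaches fc1. Is that right?
torch.autograd.backward([l1, l2])
```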
And if I instead use L = l1 + l2 and call L.backward(),
will that single backward pass still update fc2 and fc1 separately, i.e. give the same gradients?
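
That is, continuing the sketch above:

```python
# The alternative I'm comparing against: sum the losses, one backward call.
net.zero_grad()
h, out = net(x)
l1 = h.pow(2).mean()
l2 = nn.CrossEntropyLoss()(out, target)

L = l1 + l2
L.backward()  # does this accumulate the same gradients as
              # torch.autograd.backward([l1, l2]) above?
```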