How to use the summed loss of two models to update both models

I am trying to train a two-stream network. I have two models, and I would like to use the sum of their losses to update both of them.

Now I have two models and compute their losses separately: loss1 and loss2.
Then I add the two loss values: loss = loss1 + loss2.
My question is: which models' gradients will loss.backward() compute? Only the last model's, or both?

The code is as follows:

# Forward pass through each stream
spat_out = spat_model(spat_data)
temp_out = temp_model(temp_data)

# Per-stream losses against the same labels
spat_loss = spat_criterion(spat_out, labels)
temp_loss = temp_criterion(temp_out, labels)

# Sum the losses and backpropagate through both graphs
loss = spat_loss + temp_loss
loss.backward()

# Update each model with its own optimizer
spat_optimizer.step()
temp_optimizer.step()

Am I using the correct method?

Yes, this is the correct approach. The loss.backward() call will make Autograd compute the gradients for both models, since the final loss is the sum of the two partial losses: each model's parameters only affect its own partial loss, so they receive exactly the gradient of that loss.
You can verify this by printing the gradients of some parameters of each model after the backward call. Just remember to call zero_grad() on both optimizers at the start of each iteration, otherwise the gradients will accumulate across iterations.
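Here is a minimal, self-contained sketch of that check; the two nn.Linear layers, the random data, and the shared criterion are stand-ins for your actual models, inputs, and criteria:

import torch
import torch.nn as nn

# Stand-ins for spat_model and temp_model
spat_model = nn.Linear(10, 2)
temp_model = nn.Linear(10, 2)

# Dummy inputs and targets
spat_data = torch.randn(4, 10)
temp_data = torch.randn(4, 10)
labels = torch.randint(0, 2, (4,))

criterion = nn.CrossEntropyLoss()

spat_loss = criterion(spat_model(spat_data), labels)
temp_loss = criterion(temp_model(temp_data), labels)

loss = spat_loss + temp_loss
loss.backward()

# Both weight tensors now hold gradients (they were None before backward)
print(spat_model.weight.grad)
print(temp_model.weight.grad)

If either print showed None, that model would not be receiving gradients from the summed loss.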