Update tqdm loop for two models

Hello everyone,
I am training two models together, each with its own loss function and optimizer. If I update the tqdm loop for the first model, do I need to update it for the second as well, or should I only update it after the second model? The sample code looks like this:

Model_1 Training Loop

with torch.cuda.amp.autocast():
    out1 = model1(img)
    loss1 = loss_fn1(out1, gt1)
# backward
optimizer1.zero_grad()
scaler.scale(loss1).backward()
scaler.step(optimizer1)
scaler.update()
# update tqdm loop
loop.set_postfix(loss=loss1.item()) #### tqdm loop is updated here for first model (1)
train_losses1.append(loss1.item())

Model_2 Training Loop

with torch.cuda.amp.autocast():
    out2 = model2(img_batch)
    loss2 = loss_fn2(out2, gt_batch)
# backward
optimizer2.zero_grad()
scaler.scale(loss2).backward()
scaler.step(optimizer2)
scaler.update()
# update tqdm loop
loop.set_postfix(loss=loss2.item()) ### tqdm loop is updated here for second model (2)
train_losses2.append(loss2.item())

Should I remove the line marked (2), or is updating the postfix twice per iteration a valid practice?
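
For reference, here is a small standalone sketch of the single-update alternative I'm considering. It only demonstrates the tqdm side: set_postfix accepts multiple keyword arguments, and the dummy values here just stand in for loss1.item() and loss2.item() from the loops above.

from tqdm import tqdm

loop = tqdm(range(100))
for step in loop:
    dummy_loss1 = 1.0 / (step + 1)  # placeholder for loss1.item()
    dummy_loss2 = 2.0 / (step + 1)  # placeholder for loss2.item()
    # one set_postfix call per iteration can display both models' losses
    loop.set_postfix(loss1=dummy_loss1, loss2=dummy_loss2)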
Thanks
Cheers