For multi-task learning, are multiple loss instances required?

I am doing a kind of multi-task learning. I need two cross-entropy losses over the targets and prediction logits of two different tasks, A and B.

In this case, a safe way would be:

criterionA = torch.nn.CrossEntropyLoss()
criterionB = torch.nn.CrossEntropyLoss()

loss = criterionA(z_A, t_A) + criterionB(z_B, t_B)

But I am wondering if it is really necessary to create two loss instances. Would it be okay to simply create a single instance and reuse it, as below?

criterion_shared = torch.nn.CrossEntropyLoss()

loss = criterion_shared(z_A, t_A) + criterion_shared(z_B, t_B)
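For context, here is a small sanity check I put together. The shapes (batch of 4, 3 classes for task A, 5 classes for task B) and tensor names are just illustrative placeholders, not my real model:

```python
import torch

torch.manual_seed(0)
# Hypothetical toy data: batch of 4, 3 classes for task A, 5 for task B.
z_A, t_A = torch.randn(4, 3), torch.randint(0, 3, (4,))
z_B, t_B = torch.randn(4, 5), torch.randint(0, 5, (4,))

# Variant 1: one criterion per task.
criterionA = torch.nn.CrossEntropyLoss()
criterionB = torch.nn.CrossEntropyLoss()
loss_two = criterionA(z_A, t_A) + criterionB(z_B, t_B)

# Variant 2: a single shared criterion reused for both tasks.
criterion_shared = torch.nn.CrossEntropyLoss()
loss_shared = criterion_shared(z_A, t_A) + criterion_shared(z_B, t_B)

print(torch.allclose(loss_two, loss_shared))  # prints True on this toy example
```

At least on this toy example the two variants agree numerically, but I am not sure whether that holds in general or whether sharing an instance has other downsides.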

Also, I would be grateful to know which convention is preferable in general, for an arbitrary loss (not specifically cross entropy).

I assume there is a specific reason why the PyTorch developers implemented the loss functions as classes.

Many thanks!