DataLoader with a multi-task loss?

I am trying to implement minibatching using torch.utils.data.DataLoader. I have a multi-task loss where each loss term requires an input of a different size. More specifically, I am building a physics-informed neural network (PINN).

For example, for loss_1 I have 5000 samples, and for loss_2 I have 50 samples. Both sets of samples are stored as .npz files.
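For context, this is roughly how I wrap them into datasets; the file names and array keys below are only placeholders for my actual files:

import numpy as np
import torch
from torch.utils.data import TensorDataset

# Placeholder file names and array keys -- my real .npz files use different ones.
raw_1 = np.load("dataset_1.npz")  # 5000 samples used by loss_1
raw_2 = np.load("dataset_2.npz")  # 50 samples used by loss_2

dataset_1 = TensorDataset(torch.as_tensor(raw_1["inputs"], dtype=torch.float32),
                          torch.as_tensor(raw_1["targets"], dtype=torch.float32))
dataset_2 = TensorDataset(torch.as_tensor(raw_2["inputs"], dtype=torch.float32),
                          torch.as_tensor(raw_2["targets"], dtype=torch.float32))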

I have implemented my training epochs without minibatching as follows:

for i in range(max_iter):
    loss_1 = loss_A(dataset_1, output_1)  # minimises the third derivative of the output using torch.autograd.grad()
    loss_2 = loss_B(dataset_2, output_2)  # standard MSE loss
    total_loss = loss_1 + loss_2
    # minimise the total loss

How do I implement this with minibatches using torch.utils.data.DataLoader? With minibatching, you backpropagate the loss for each batch, as follows:

training_loader = torch.utils.data.DataLoader(training_set, batch_size=4, shuffle=True)
for epoch in range(max_epoch):
    for i, data in enumerate(training_loader):
        inputs, labels = data
        loss_1 = loss_A(inputs, output_1)
        # minimise the total loss in each batch

I want to draw minibatches from two different datasets simultaneously. I hope my problem is clear. I know there are solutions; I just wanted to ask if someone has a trick for this.
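One idea I have been toying with (I am not sure it is the right approach) is to give each dataset its own DataLoader and cycle the smaller one with itertools.cycle, so that every batch of the large dataset is paired with a batch of the small one. A rough sketch, where dataset_1, dataset_2, model, loss_A, loss_B, optimizer and max_epoch stand in for my actual objects:

from itertools import cycle
from torch.utils.data import DataLoader

# dataset_1 (5000 samples) feeds loss_1, dataset_2 (50 samples) feeds loss_2.
loader_1 = DataLoader(dataset_1, batch_size=64, shuffle=True)
loader_2 = DataLoader(dataset_2, batch_size=4, shuffle=True)

for epoch in range(max_epoch):
    # loader_1 yields many more batches, so the shorter loader_2 is cycled;
    # zip stops once loader_1 is exhausted, i.e. one pass over the large dataset.
    for (inputs_1, labels_1), (inputs_2, labels_2) in zip(loader_1, cycle(loader_2)):
        output_1 = model(inputs_1)
        output_2 = model(inputs_2)
        loss_1 = loss_A(inputs_1, output_1)  # derivative-based physics loss
        loss_2 = loss_B(labels_2, output_2)  # standard MSE loss
        total_loss = loss_1 + loss_2

        optimizer.zero_grad()
        total_loss.backward()
        optimizer.step()

The obvious consequence is that the 50 samples get reused many times within a single pass over the 5000 samples, and I am not sure whether that pairing is sensible or whether there is a better trick.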