How do I know my custom loss is working if it is part of a hybrid loss?

I am training a model whose loss is a linear combination of three losses: two built-in PyTorch losses and one custom loss. Simplified:

l1 = torch.nn.L1Loss()
custom_loss = my_loss(out3 - t3)
loss = l1(out1, t1) + l1(out2, t2) + custom_loss

How do I know for sure that custom_loss is working as intended? I have set everything in my_loss using requires_grad. When I check custom_loss right after it is computed, it gives True:

print(custom_loss.requires_grad)
True

Also, when I print it, I get:

print(custom_loss)
tensor(0.6144, device='cuda:0', grad_fn=<DivBackward0>)

So I think everything is alright. However, I am still not sure whether my_loss actually takes part in the backward pass, i.e., whether the weight updates are driven only by the other two losses.

You could plot custom_loss during training, or plot all three parts of the loss separately, and then compare the total loss against the individual terms. I think this is a proper way to check whether your custom loss works.
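
For example, a minimal sketch of that logging (loss1, loss2, and step are illustrative names for the pieces in your training loop; this uses torch.utils.tensorboard, which ships with PyTorch):

from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter()  # logs to ./runs by default

# Inside the training loop, log each term with .item() so no graph is retained:
writer.add_scalar("loss/l1_first", loss1.item(), step)
writer.add_scalar("loss/l1_second", loss2.item(), step)
writer.add_scalar("loss/custom", custom_loss.item(), step)
writer.add_scalar("loss/total", loss.item(), step)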

I already know that the custom loss is changing, as I am watching all three losses.
But this is a GAN, which is highly unstable, so I am still not sure whether the custom loss is changing on its own or only because of the other two losses, which are definitely taking part in the backpropagation.

Another way around this, though I am not sure about it either, is to set the other two losses to zero and see whether the custom loss still changes.

What do you think about this approach?

It seems viable, but if you turn off the first two losses and train only with custom_loss, I am not sure whether there is interaction between the first two losses and the last one. You could try it and adopt whatever works.


Multiply the other losses by zero, then look at the gradients of your model. If they are changing, and if they have a significant magnitude, everything should be working as well as it could be.
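
A minimal sketch of that check (model, optimizer, loss1, and loss2 are placeholder names for the pieces in the question):

# Multiply the built-in losses by zero so only the custom term can
# produce gradients, then inspect what backward() leaves behind.
loss = 0.0 * loss1 + 0.0 * loss2 + custom_loss
optimizer.zero_grad()
loss.backward()
for name, p in model.named_parameters():
    if p.grad is not None:
        # A nonzero magnitude means the custom loss reaches this weight.
        print(name, p.grad.abs().mean().item())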


I have used multiple losses that way (summing them) with success. The loss.backward() call computes the gradients of the summed result with respect to all weights, and the optimizer then applies those gradients in its update step. So your loss is used in the backward pass.
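
As a self-contained toy illustration of why that holds (these are stand-ins, not your actual losses): the gradient of a sum is the sum of the gradients of each term, so every term in the sum contributes.

import torch

w = torch.tensor(2.0, requires_grad=True)
loss_a = (w - 1.0) ** 2   # stand-in for a built-in loss term
loss_b = 3.0 * w          # stand-in for the custom term
(loss_a + loss_b).backward()
print(w.grad)  # tensor(5.) = 2*(w - 1) + 3, so both terms contributed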

One way to check if it “works” (the loss value decreases if it should decrease) is to plot it separately, as @MariosOreo said.

Another way to check whether it “works” (i.e., helps your current objective) is to train without your custom loss (remove it from the sum) and then compare the results (overall loss, and accuracy or outputs if applicable).

The problem here, and I could be wrong of course, is that the model might still learn from the other losses, so the custom loss would keep decreasing/changing anyway. To overcome this, we could zero out the other losses before the backward pass, keep only the custom loss, and plot it. If the loss curve then does not change, the custom loss is not being used to update the model, because it generates no gradients for backprop to apply to the weights. Alternatively, as noted by @LeviViana, we can zero out the other losses and look directly at the gradients of the model.
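
A more direct version of that gradient check (a sketch; model, my_loss, out3, and t3 stand in for the ones above) is to ask autograd for the custom loss's gradients alone, without zeroing anything:

import torch

custom_loss = my_loss(out3 - t3)
names, params = zip(*[(n, p) for n, p in model.named_parameters() if p.requires_grad])
grads = torch.autograd.grad(custom_loss, params,
                            retain_graph=True,   # keep the graph for the full backward later
                            allow_unused=True)   # None means this loss never touches the param
for n, g in zip(names, grads):
    print(n, "unused" if g is None else g.abs().mean().item())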

True, I didn’t think about that aspect.

Then you can try two things: train with only the first two losses, and with only your custom loss (as well as with all three), and track the progress, e.g. with per-term weights as sketched below. There’s nothing like actually trying it to verify whether it works or not :slight_smile:
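
One easy way to run those ablations (the weights here are illustrative) is to gate each term with a scalar:

# Set any weight to 0.0 to drop that term from the experiment.
w1, w2, w3 = 1.0, 1.0, 1.0
loss = w1 * l1(out1, t1) + w2 * l1(out2, t2) + w3 * my_loss(out3 - t3)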
