Hello, how can I combine two different losses?

The first: sigmoid -> BCE loss.

The second: softmax -> cross-entropy loss for multi-label classification.

However, the two outputs share parameters (one model, two heads).

Sorry, I am not good at writing English.

The multi-label part learns very well, but the binary part does not.
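Roughly, my setup looks like this (a minimal sketch; the layer sizes, names, and shapes are placeholders, not my real model):

```python
import torch
import torch.nn as nn

class TwoHeadNet(nn.Module):
    def __init__(self, in_features=128, num_classes=10):
        super().__init__()
        # shared backbone: these parameters get gradients from both losses
        self.backbone = nn.Sequential(nn.Linear(in_features, 64), nn.ReLU())
        self.binary_head = nn.Linear(64, 1)           # sigmoid -> BCE loss
        self.label_head = nn.Linear(64, num_classes)  # softmax -> cross-entropy

    def forward(self, x):
        f = self.backbone(x)
        out1 = torch.sigmoid(self.binary_head(f))
        out2 = torch.softmax(self.label_head(f), dim=1)  # softmax applied here
        return out1, out2

model = TwoHeadNet()
criterion1 = nn.BCELoss()
criterion2 = nn.CrossEntropyLoss()

x = torch.randn(4, 128)                         # dummy batch
y_binary = torch.randint(0, 2, (4, 1)).float()  # binary targets
y_label = torch.randint(0, 10, (4,))            # class indices

out1, out2 = model(x)
loss1 = criterion1(out1, y_binary)
loss2 = criterion2(out2, y_label)
```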

Currently I just sum them: loss = loss1 + loss2

Should I give each loss a weight?

I would appreciate your reply.

I assume you have two different outputs in your model, i.e. one trained with nn.BCELoss and the other with nn.CrossEntropyLoss?
Now one part of your model learns quite well, while the other gets stuck?
A weighting of these losses might be a good idea.
Could you compare the ranges of both losses and try to rescale them to a similar range?
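For example, something along these lines (the weights are made-up values; pick them based on the loss ranges you actually observe):

```python
import torch

# stand-ins for your two losses, just to make the snippet runnable;
# in your training loop these would be the real loss1 and loss2
loss1 = torch.tensor(0.05, requires_grad=True)  # binary loss (smaller range)
loss2 = torch.tensor(2.30, requires_grad=True)  # multi-label loss (larger range)

# hypothetical weights, chosen so both terms end up in a similar range
w1, w2 = 10.0, 1.0
loss = w1 * loss1 + w2 * loss2
loss.backward()
```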

Also, as a small side note: if you are using nn.CrossEntropyLoss for the classification, you should pass the logits to this criterion, not the probabilities produced by nn.Softmax.
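I.e. something like this:

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()

logits = torch.randn(4, 10)          # raw model output: [batch_size, nb_classes]
target = torch.randint(0, 10, (4,))  # class indices

# nn.CrossEntropyLoss applies log_softmax internally, so pass the logits directly
loss = criterion(logits, target)

# not: criterion(torch.softmax(logits, dim=1), target)
```

(Similarly, nn.BCEWithLogitsLoss would let you pass raw logits for the binary output as well, and is numerically more stable than sigmoid + nn.BCELoss.)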

Thank you, your answer was a great help!