BCELoss with positive weights

Hi, I’m doing multi-label binary classification (8 classes), and my model’s outputs have already been passed through a sigmoid (so I don’t want to use BCEWithLogitsLoss). How do I incorporate a positive label weight for each of my 8 classes? Thanks!

pos_weight is currently only implemented in nn.BCEWithLogitsLoss, so you would have to either reimplement BCELoss with this weight argument or switch to BCEWithLogitsLoss.
I would recommend the second approach, as it is also more numerically stable.
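A minimal sketch of the second approach, assuming you compute one positive weight per class (the weight values below are made up for illustration; a common heuristic is negative count / positive count per class):

```python
import torch
import torch.nn as nn

# Hypothetical per-class positive weights for the 8 labels,
# e.g. num_negatives / num_positives computed from your training set.
pos_weight = torch.tensor([1.0, 2.0, 0.5, 3.0, 1.5, 1.0, 4.0, 2.5])

criterion = nn.BCEWithLogitsLoss(pos_weight=pos_weight)

logits = torch.randn(4, 8)                     # raw model outputs, no sigmoid
targets = torch.randint(0, 2, (4, 8)).float()  # multi-hot ground truth

loss = criterion(logits, targets)              # scalar loss
```

Note that the model outputs fed to the criterion are raw logits; BCEWithLogitsLoss applies the sigmoid internally in a numerically stable way.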


To change to BCEWithLogitsLoss, would I simply get rid of my final Sigmoid layer and, during training, pass those outputs to the loss function? In my validation, I’ll have to call the sigmoid function on my outputs before I compare predicted labels and ground truths, right? Am I missing anything?

Yes, just remove the sigmoid and pass the raw outputs directly to BCEWithLogitsLoss. You don’t actually have to call sigmoid again during validation: since sigmoid(0) = 0.5, thresholding the logits at 0 gives the same predictions as thresholding the probabilities at 0.5.
