How to pass weights parameter inside nn.BCEWithLogitsLoss

Let’s say I have to train an image classifier on a highly unbalanced dataset. I would then like to penalize the losses belonging to the dominating classes less, and vice versa!

Can you please show with a few lines of code how exactly the weight argument in nn.BCEWithLogitsLoss is passed?

Imagine we have a dataset in which we have three classes with the following number of examples:
classA: 900
classB: 90
classC: 10
Now how would you define your loss function, and how would you pass the weight argument?

Would it be like the following?

loss_fn = nn.CrossEntropyLoss(weight = torch.tensor([1 - 900/1000, 1 - 90/1000, 1 - 10/1000]))

loss_fn = nn.CrossEntropyLoss(weight = torch.tensor([1 - 900/1000, 1 - 90/1000, 1 - 10/1000]))

This will work (note that weight must be a torch.Tensor, not a plain Python list).
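For reference, here is a minimal sketch of that usage end to end. The class counts are the ones from the example above, and the dummy batch shapes are just illustrative:

```python
import torch
import torch.nn as nn

# Class counts from the example: classA, classB, classC
counts = torch.tensor([900.0, 90.0, 10.0])

# weight must be a 1-D float tensor with one entry per class
weight = 1.0 - counts / counts.sum()          # approx [0.10, 0.91, 0.99]
loss_fn = nn.CrossEntropyLoss(weight=weight)

# Dummy batch: 4 samples, 3 classes
logits = torch.randn(4, 3)
targets = torch.tensor([0, 1, 2, 0])
loss = loss_fn(logits, targets)               # weighted scalar loss
```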


Hello Gurgaon!

You are correct that the named weight argument is the way to
provide class weights to nn.CrossEntropyLoss.

But the weights you use in your example don’t make a lot of
sense to me. The weights you give for classes B and C are
not very different – they are 0.91 and 0.99, respectively, even
though you have nine times as many class-B samples as
class-C samples.

If you want your weights to fully compensate for the unbalanced
number of samples per class, you would want relative weights of:

    weight = [ 1/900, 1/90, 1/10 ]

Furthermore, if you want your weights to have – on average – an
overall scale of 1, you would want:

    weight = (1000 / 3) * [ 1/900, 1/90, 1/10 ]
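Putting those two suggestions together in actual PyTorch code (remember that weight must be a tensor, so the scalar-times-list notation above is shorthand), a sketch using the example’s counts:

```python
import torch
import torch.nn as nn

counts = torch.tensor([900.0, 90.0, 10.0])   # classA, classB, classC
n_samples = counts.sum()                     # 1000
n_classes = counts.numel()                   # 3

# Inverse-frequency weights, rescaled so that the average weight
# taken over all samples in the dataset equals 1:
weight = (n_samples / n_classes) * (1.0 / counts)

loss_fn = nn.CrossEntropyLoss(weight=weight)
```

The "scale of 1" property here means that sum over classes of count_c * weight_c equals the total number of samples, so the weighting reshapes the per-class contributions without changing the overall magnitude of the loss.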

(Also, could you edit the title and body of your post to change
“nn.BCEWithLogitsLoss” to “nn.CrossEntropyLoss” where it
occurs.)

Best regards.

K. Frank
