BCEWithLogitsLoss and Class Weights

I am trying to apply class weights to BCEWithLogitsLoss. I thought I had read every post on this and was well prepared to do so, but apparently not.

I have two classes.

[ 0.50897302 28.36130137]
criterion = nn.BCEWithLogitsLoss(pos_weight=torch.tensor(class_weights, device=device))

Each batch is 136 items.

I get:

RuntimeError: output with shape [136, 1] doesn't match the broadcast shape [136, 2]

I am not sure what I am doing wrong.
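For reference, here is a minimal sketch that reproduces the mismatch (shapes and weights taken from above; random logits stand in for my network's output):

```python
import torch
import torch.nn as nn

# One logit per sample, but a two-element pos_weight that
# broadcasting cannot reconcile with the [136, 1] output.
logits = torch.randn(136, 1)                     # network output
targets = torch.randint(0, 2, (136, 1)).float()  # binary labels
class_weights = [0.50897302, 28.36130137]

criterion = nn.BCEWithLogitsLoss(pos_weight=torch.tensor(class_weights))
try:
    criterion(logits, targets)
except RuntimeError as e:
    print(e)  # output with shape [136, 1] doesn't match the broadcast shape [136, 2]
```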


Hello bfeeny!

Because your output has shape [136, 1], I conclude that your
network has a single output, and you are performing a single-label
(in contrast to multi-label) binary classification problem.

Even though you technically have two classes (the “0” or “no”
class and the “1” or “yes” class), the pos_weight constructor
argument of BCEWithLogitsLoss only takes a single weight,
namely that for the “1” (“positive”) class.

So you want something like:

criterion = nn.BCEWithLogitsLoss(pos_weight=torch.tensor([28.36 / 0.5090], device=device))

where I’ve chosen to pass in as pos_weight the relative weight
given by what I presume are your weights for class “0” and
class “1”.
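Concretely, a sketch with the batch shape from your error message (random logits in place of your network's output):

```python
import torch
import torch.nn as nn

# A single positive-class weight: the ratio of the two per-class weights.
pos_weight = torch.tensor([28.36 / 0.5090])      # roughly 55.7

criterion = nn.BCEWithLogitsLoss(pos_weight=pos_weight)

logits = torch.randn(136, 1)                     # one logit per sample
targets = torch.randint(0, 2, (136, 1)).float()
loss = criterion(logits, targets)                # scalar loss, no shape error
```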


K. Frank


Thanks for your reply. I have much more of my 0 class and not much of my 1 class. So you are saying I should pass in just the weight of my 1 class. Would that be 28.36 / 0.5090?

0.5090 is the “weight” of my 0 class and 28.36 is the weight of my 1 class.

Hello bfeeny!

Yes, I think that is what I am saying.

But, to be concrete, let’s take an extreme example:

Let’s say that your entire training set consists of 1,000 samples,
990 class-“0” samples, and 10 class-“1” samples. If you want
to reweight the classes in your loss function to fully account for
this imbalance, you would use:

criterion = nn.BCEWithLogitsLoss(pos_weight=torch.tensor([99.0]))


K. Frank
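To see what pos_weight does, note that with pos_weight = 99 the positive term of the loss is simply multiplied by 99. A sketch verifying this against the naive binary-cross-entropy formula (the built-in uses a numerically stable form, so they agree for moderate logits):

```python
import torch
import torch.nn as nn

w = 99.0  # positive-class weight from the 990-vs-10 example
logits = torch.randn(8, 1)
targets = torch.randint(0, 2, (8, 1)).float()

criterion = nn.BCEWithLogitsLoss(pos_weight=torch.tensor([w]), reduction="none")
loss = criterion(logits, targets)

# Manual form: -[w * y * log(sigmoid(x)) + (1 - y) * log(1 - sigmoid(x))]
p = torch.sigmoid(logits)
manual = -(w * targets * torch.log(p) + (1 - targets) * torch.log(1 - p))
print(torch.allclose(loss, manual, atol=1e-4))  # True
```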


Hi, how does this solution apply if I have 3 or more classes?