Correctly assign weights for binary classification with BCEWithLogitsLoss

Hey

Can I ask if this is the correct way to assign weights for an unbalanced dataset?

My class weights are:
{0: 1.94, 1: 0.67}

criterion = torch.nn.BCEWithLogitsLoss(pos_weight=torch.tensor([1.94 / 0.67], device=device))

Is the above correct?

This post might be helpful.


pos_weight = num_negative / num_positives
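
As a minimal sketch of that formula (the counts below are hypothetical, just for illustration):

import torch

num_negative = 300  # hypothetical count of samples with label 0
num_positive = 100  # hypothetical count of samples with label 1

device = "cuda" if torch.cuda.is_available() else "cpu"

# pos_weight = num_negative / num_positive
pos_weight = torch.tensor([num_negative / num_positive], device=device)
criterion = torch.nn.BCEWithLogitsLoss(pos_weight=pos_weight)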

Patric, a question:

According to your post it is:

num_neg / num_positive

So if the class counts are:

0: 10000, 1: 20000

it will be 10000 / 20000, but if they are:

0: 20000, 1: 10000

it will be 20000 / 10000?

Is this correct? That's my understanding.
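
As a quick sketch of those two hypothetical cases:

# Case 1: 10000 negatives, 20000 positives
pos_weight = 10000 / 20000  # 0.5: positives are the majority, so each positive is down-weighted

# Case 2: 20000 negatives, 10000 positives
pos_weight = 20000 / 10000  # 2.0: positives are the minority, so each positive is up-weighted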

Yes, the docs also give this example:

For example, if a dataset contains 100 positive and 300 negative examples of a single class, then pos_weight for the class should be equal to 300/100=3. The loss would act as if the dataset contains 3×100=300 positive examples.
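
A quick numerical check of that statement (a sketch with random logits; the counts mirror the docs example):

import torch
import torch.nn.functional as F

torch.manual_seed(0)
logits_pos = torch.randn(100)  # logits for the 100 positive examples
logits_neg = torch.randn(300)  # logits for the 300 negative examples

logits = torch.cat([logits_pos, logits_neg])
targets = torch.cat([torch.ones(100), torch.zeros(300)])

# Weighted loss with pos_weight = num_neg / num_pos = 300 / 100 = 3
weighted = F.binary_cross_entropy_with_logits(
    logits, targets, pos_weight=torch.tensor([3.0]), reduction="sum")

# Unweighted loss over a dataset where each positive example appears 3 times
logits_rep = torch.cat([logits_pos.repeat(3), logits_neg])
targets_rep = torch.cat([torch.ones(300), torch.zeros(300)])
replicated = F.binary_cross_entropy_with_logits(logits_rep, targets_rep, reduction="sum")

print(torch.allclose(weighted, replicated))  # True: the weighted loss matches the replicated dataset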