How to give more importance to one class

I have five classes to predict and would like one of my models to prioritize class 3, how could I code that?

Is this what you mean: WeightedRandomSampler?

I think so, how does it work exactly? I just read about it.

Since I have five classes and I want to prioritize class 3, would it be something like:

num_sample = 5
weight = [0.5, 0.5, 0.5, 0.9, 0.5]
sampler = torch.utils.data.sampler.WeightedRandomSampler(weight, batch_size)
trainLoader = torch.utils.data.DataLoader(trainData, num_sample, shuffle=False, sampler=sampler)

Also, I found this comment: “It looks like weights is a list of weights per data point in the dataset we are drawing from, NOT a weight per class.”

Yes, WeightedRandomSampler expects one weight value per sample in the dataset, not per class.
Here is a complete example.
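The following is a minimal sketch with a made-up TensorDataset, class counts, and boost factor; the idea is to build per-class weights first and then expand them to per-sample weights by indexing with the labels:

import torch
from torch.utils.data import DataLoader, TensorDataset, WeightedRandomSampler

# Made-up dataset: 100 samples with 10 features each, labels from 5 classes
data = torch.randn(100, 10)
targets = torch.randint(0, 5, (100,))
train_data = TensorDataset(data, targets)

# One weight per class: invert the class frequencies, then boost class 3
# (clamp avoids a division by zero if a class happens to be missing)
class_counts = torch.bincount(targets, minlength=5).clamp(min=1).float()
class_weights = 1.0 / class_counts
class_weights[3] *= 2.0  # arbitrary boost factor for the prioritized class

# Expand to one weight per sample by indexing with each sample's label
sample_weights = class_weights[targets]

sampler = WeightedRandomSampler(
    weights=sample_weights,
    num_samples=len(sample_weights),
    replacement=True,
)
train_loader = DataLoader(train_data, batch_size=10, sampler=sampler)

for batch_data, batch_targets in train_loader:
    # class 3 will now be drawn more often than its raw frequency suggests
    pass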

Alternatively, you could also use a class weight and pass it to your criterion, if it fits your use case.
E.g. nn.CrossEntropyLoss accepts a weight tensor, which assigns a specific weight to each class.
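For example, a small sketch (the weight values are arbitrary; index 3 corresponds to the class you want to prioritize):

import torch
import torch.nn as nn

# Arbitrary weights for 5 classes; class 3 (index 3) is weighted higher
weight = torch.tensor([1.0, 1.0, 1.0, 2.0, 1.0])
criterion = nn.CrossEntropyLoss(weight=weight)

logits = torch.randn(8, 5)            # batch of 8 samples, 5 classes
targets = torch.randint(0, 5, (8,))   # class indices
loss = criterion(logits, targets)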

How about nn.BCEWithLogitsLoss or nn.BCELoss? How can we pass the weights there? I know I would be passing weight=weight, but the weight tensor should then contain only one element, so how could we do that?

I would generally recommend using nn.BCEWithLogitsLoss instead of nn.BCELoss for increased numerical stability and the availability of the pos_weight argument.

Note that both loss functions are used for binary or multi-label classification/segmentation, so I’m not sure if it’ll fit your use case.

That being said, you could specify the pos_weight argument as described in the docs to counter a class imbalance in your dataset. You could also use the weight argument to rescale each sample in the batch. In the latter case I would personally use the functional API, as you would otherwise have to recreate the criterion in each iteration based on the current batch.
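A rough sketch of both options (the multi-label setup with 5 labels, the shapes, and the weight values are all assumptions):

import torch
import torch.nn as nn
import torch.nn.functional as F

logits = torch.randn(8, 5)                      # 8 samples, 5 independent labels
targets = torch.randint(0, 2, (8, 5)).float()   # multi-label targets in {0, 1}

# pos_weight: one value per label; positives of label 3 count 4x as much
pos_weight = torch.tensor([1.0, 1.0, 1.0, 4.0, 1.0])
criterion = nn.BCEWithLogitsLoss(pos_weight=pos_weight)
loss = criterion(logits, targets)

# weight: rescale elements of the current batch via the functional API,
# so the criterion does not have to be recreated in every iteration
batch_weights = torch.ones_like(targets)
batch_weights[targets[:, 3] == 1] = 2.0  # e.g. upweight samples positive for label 3
loss = F.binary_cross_entropy_with_logits(logits, targets, weight=batch_weights)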


Okay, going back to the multi-class issue. Say I have 3 classes with the following number of labels:

Class 1 : 3711
Class 2 : 7107
Class 3 : 2942

I was wondering if there’s any other way to assign class weights rather than just inverting the counts like this:

weight = 1.0 / torch.tensor([3711, 7107, 2942], dtype=torch.float)
criterion = nn.CrossEntropyLoss(weight=weight)

And also, do we have to follow any rule, such as all the weights having to add up to something?

Thanks Patrick, what does the tensor consist of? 5 values between zero and one?

The weights do not necessarily need to be values between 0 and 1.
One common way to compute them is sklearn’s compute_class_weight: https://scikit-learn.org/stable/modules/generated/sklearn.utils.class_weight.compute_class_weight.html
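For instance, a sketch of sklearn’s "balanced" heuristic applied to the counts from above (the label array is reconstructed just for illustration):

import numpy as np
import torch
import torch.nn as nn
from sklearn.utils.class_weight import compute_class_weight

# Rebuild a label array from the class counts given above
y = np.array([0] * 3711 + [1] * 7107 + [2] * 2942)

# "balanced" computes n_samples / (n_classes * bincount(y)) per class
class_weights = compute_class_weight(class_weight="balanced", classes=np.unique(y), y=y)
print(class_weights)  # the values are not restricted to [0, 1]

weight = torch.tensor(class_weights, dtype=torch.float)
criterion = nn.CrossEntropyLoss(weight=weight)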