How to set weights individually for 38 classes in Cross Entropy Loss in PyTorch

Hi everyone,
I have 38 classes: 37 colored-pixel classes and 1 black-pixel class. My goal is to give more weight to the colored pixels but less weight to the black pixel. My question is how to do this.

Thanks in advance

Hello Fikrat!

I answered this exact question in my reply to your previous post:

If this doesn’t work for you or you have further questions, it would
probably make the most sense to move the discussion back to that
earlier thread.


K. Frank

Dear Frank,

Thanks for your kind reply and the valuable time you shared with me. But I would like to understand the principle of how to set the weight for each individual color class, which I find confusing.
To be clearer, here is the constructor I wrote:

class SegmentationLosses(object):
    def __init__(self, weight=None, size_average=True, batch_average=True, ignore_index=255, cuda=False):
        self.ignore_index = ignore_index
        self.weight = weight
        self.size_average = size_average
        self.batch_average = batch_average
        self.cuda = cuda

Hi Fikrat!

I’m not entirely sure what you are asking.

But let’s say you have 38 classes represented by (labelled by) colored
pixels – class 0 = black, class 1 = color-1, class 2 = color-2, and so on.

You would count – or estimate the count of – each kind of pixel in your
dataset, for example:
count[0] = 1000, count[1] = 10, ..., count[37] = 10

This would be an unbalanced dataset, that is, there are 100 times as
many black pixels (examples of class 0) as pixels of any other single
color (examples of any other single class).

You would then set weight = 1.0 / count and pass it as the weight
argument to torch.nn.CrossEntropyLoss when you instantiate it.
That is, you reduce the weight (by a factor of 1 / 100) of class 0, for
which you have many examples, relative to the other 37 classes, for
which, individually, you have significantly fewer examples.

(weight, in this example, is a tensor of shape [38], where 38 is the
number of classes you have.)

This looks like you are defining your own loss-function class.

There is no need to do this. If I understand your use case you should
simply use pytorch’s built-in torch.nn.CrossEntropyLoss, as it
already supports reweighting an unbalanced dataset by using its
constructor’s weight argument, as outlined above.

Good luck!

K. Frank

Dear Mr. Frank

Thanks for your kind support and valuable time.