Cross entropy with logit targets

I am training a model for classification where each ground-truth label carries some uncertainty and is therefore a vector of probability scores, e.g. [0.1, 0, 0.7, 0, 0.2, 0, 0] instead of [0, 0, 1, 0, 0, 0, 0]. The cross-entropy loss in PyTorch, however, accepts only an integer class index as the target, so I was hoping someone could recommend a solution or an alternative loss function suitable for my classification problem.

https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html


Hi Haziq!

(As an aside, your title for this thread, “Cross entropy with logit
targets” should probably be “Cross entropy with probability targets,”
since you ask about “probability scores” and use probabilities in the
example you give.)

Your ground-truth target probabilities are what are sometimes called
"soft labels." PyTorch's CrossEntropyLoss does not support this kind
of target, but you can write a version that does (a sketch follows
below). For details, please see this post:
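
For example, here is a minimal sketch of such a loss, assuming logits
and targets both have shape (batch_size, num_classes); the name
soft_cross_entropy is just illustrative:

import torch

def soft_cross_entropy(logits, targets):
    # Convert raw scores to log-probabilities.
    log_probs = torch.nn.functional.log_softmax(logits, dim=1)
    # Cross entropy with probability targets: -sum_c p_c * log q_c,
    # averaged over the batch.
    return -(targets * log_probs).sum(dim=1).mean()

logits = torch.randn(4, 7, requires_grad=True)
targets = torch.tensor([[0.1, 0.0, 0.7, 0.0, 0.2, 0.0, 0.0]] * 4)
loss = soft_cross_entropy(logits, targets)
loss.backward()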

Best.

K. Frank


Update: from version 1.10, PyTorch supports class probability targets in CrossEntropyLoss, so you can now simply use:

criterion = torch.nn.CrossEntropyLoss()
loss = criterion(x, y)

where x is the input and y is the target. When y has the same shape as x, it is treated as containing class probabilities. Note that x is expected to contain raw, unnormalized scores (logits) for each class, while y is expected to contain per-class probabilities that sum to 1 (soft labels, e.g. smoothed labels or the softmax output of a teacher model). You can find details in the docs.
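
For example, a minimal runnable sketch (the batch size of 2 and the 7
classes are assumed for illustration, reusing the probabilities from
the question):

import torch

criterion = torch.nn.CrossEntropyLoss()

# Raw, unnormalized scores (logits) for a batch of 2 samples, 7 classes.
x = torch.randn(2, 7, requires_grad=True)

# Probability ("soft-label") targets with the same shape as x;
# each row sums to 1.
y = torch.tensor([[0.1, 0.0, 0.7, 0.0, 0.2, 0.0, 0.0],
                  [0.0, 0.2, 0.0, 0.5, 0.0, 0.3, 0.0]])

loss = criterion(x, y)  # requires PyTorch >= 1.10
loss.backward()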