(As an aside, your title for this thread, “Cross entropy with logit
targets,” should probably be “Cross entropy with probability targets,”
since you ask about “probability scores” and use probabilities in the
example you give.)
Your ground-truth target probabilities are what are sometimes called
“soft labels.” PyTorch’s CrossEntropyLoss does not support this kind
of target, but you can write a version that does. For details, please
see this post:
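In the meantime, here is a minimal sketch of one way to write such a
loss (this is my own illustration, not necessarily the version in the
linked post; the name `soft_cross_entropy` is made up, and it assumes
logits of shape `(nBatch, nClass)` and target probabilities of the
same shape whose rows sum to one):

```python
import torch
import torch.nn.functional as F

def soft_cross_entropy(logits, target_probs):
    # convert raw scores to log-probabilities
    log_probs = F.log_softmax(logits, dim=1)
    # expected negative log-likelihood under the target distribution,
    # averaged over the batch
    return -(target_probs * log_probs).sum(dim=1).mean()

# example usage with random data
logits = torch.randn(4, 3)                     # (nBatch, nClass) raw scores
targets = F.softmax(torch.randn(4, 3), dim=1)  # probabilities; rows sum to one
loss = soft_cross_entropy(logits, targets)
```

(Note also that as of PyTorch 1.10, CrossEntropyLoss itself accepts
probability targets directly, so on a recent install you may not need
a custom version at all.)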