Soft Labeling Cross Entropy Loss in PyTorch

What is the easiest way to implement cross entropy loss with soft labels? For example, the labels are 0.1 and 0.9 instead of 0/1.

You can soften your target labels and then feed them into CrossEntropyLoss.

No, we cannot do that. CrossEntropyLoss needs to be fed a target tensor of class indices as a torch.LongTensor.
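For context, a minimal sketch of the target format being referred to here, with integer class indices rather than per-class probabilities (the shapes and values are just illustrative):

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()
logits = torch.randn(4, 2)            # (N, C) raw scores from the model
targets = torch.tensor([1, 0, 1, 1])  # class indices as a LongTensor, not probabilities
loss = criterion(logits, targets)
```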


You are right. Implementing cross entropy manually is probably the easiest thing to do. If you want a faster backward pass, you can even write a custom autograd.Function.
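A minimal sketch of such a manual implementation, assuming logits of shape (N, C) and soft targets of the same shape holding per-class probabilities (the function name and the 0.1/0.9 targets are illustrative, following the question above):

```python
import torch
import torch.nn.functional as F

def soft_cross_entropy(logits, soft_targets):
    """Cross entropy with soft (probability) targets."""
    log_probs = F.log_softmax(logits, dim=1)       # numerically stable log-probabilities
    loss = -(soft_targets * log_probs).sum(dim=1)  # per-sample cross entropy
    return loss.mean()                             # average over the batch

# Example usage with the 0.1 / 0.9 labels from the question
logits = torch.randn(4, 2, requires_grad=True)
targets = torch.tensor([[0.1, 0.9]] * 4)
loss = soft_cross_entropy(logits, targets)
loss.backward()
```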

Won’t we need to take care of numerical stability when using log_softmax?
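For what it’s worth, a small sketch of why log_softmax itself is the stable choice compared to composing log with softmax (the input values are only illustrative):

```python
import torch
import torch.nn.functional as F

x = torch.tensor([[1000.0, 0.0]])

# Naive composition underflows: softmax gives [1., 0.], and log(0) is -inf
naive = torch.log(torch.softmax(x, dim=1))  # tensor([[0., -inf]])

# log_softmax applies the log-sum-exp trick internally and stays finite
stable = F.log_softmax(x, dim=1)            # tensor([[0., -1000.]])
```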