Reduce the penalty for misclassification between a few similar classes

I’m a beginner in computer vision and PyTorch, working on a multi-class classification problem (MNIST). How do I reduce the penalty for misclassifying, for example, a 5 as a 6 (a human would make a similar mistake, since the two digits look quite close on a badly drawn image), or a 1 as a 7 in some images? The weight parameter of the loss function is not really suitable for this, as it is meant for imbalanced datasets, and my dataset is not imbalanced. Do I need to write a few if conditions in a custom loss function, or is there a built-in way to define a smaller penalty for confusing certain pairs of classes?

Hi Abhilash!

The most straightforward approach would be to use cross entropy as
your loss function, but with probabilistic (“soft”) labels rather than
categorical (“hard”) labels.

Older versions of PyTorch do not have a soft-label version of cross-entropy
loss built in (as of version 1.10, torch.nn.CrossEntropyLoss itself accepts
class probabilities as the target), but it is also easy to implement one by
hand. See this post:

[linked forum post: how to use soft labels to reduce the penalty for certain misclassifications]
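A minimal sketch of such a hand-rolled soft-label cross entropy, for the MNIST case described above. The 5↔6 and 1↔7 pairs and the 0.1 probability “leak” are illustrative choices, not prescribed values:

```python
import torch
import torch.nn.functional as F

NUM_CLASSES = 10

# Pairs of digits a human might confuse (illustrative choice).
CONFUSABLE = {5: 6, 6: 5, 1: 7, 7: 1}
LEAK = 0.1  # probability mass shifted to the confusable class

# One soft-label distribution per class, as rows of a lookup table.
soft_labels = torch.eye(NUM_CLASSES)
for a, b in CONFUSABLE.items():
    soft_labels[a, a] = 1.0 - LEAK
    soft_labels[a, b] = LEAK

def soft_cross_entropy(logits, targets):
    """Cross entropy against soft targets looked up from hard class indices."""
    soft_targets = soft_labels[targets]       # (batch, NUM_CLASSES)
    log_probs = F.log_softmax(logits, dim=1)  # (batch, NUM_CLASSES)
    return -(soft_targets * log_probs).sum(dim=1).mean()
```

With these labels, confidently predicting 6 for a true 5 incurs a smaller loss than confidently predicting, say, 3 for a true 5, which is exactly the reduced penalty asked about.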


K. Frank