Hello Matheus!
I won’t comment on whether KL-divergence or cross-entropy is
likely to be the better loss function for your use case.
But (assuming I understand what you are asking), no, you can’t
use PyTorch’s built-in CrossEntropyLoss
with probabilities for
targets (sometimes called soft labels, a term I don’t much like).
It requires integer class labels (even though cross-entropy makes
perfect sense for targets that are probabilities).
However, you can write your own without much difficulty (or loss
of performance). See this post:
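
As a rough illustration, a hand-rolled version might look something
like the sketch below (the function name and shapes are my own
choices, just for the example, not anything official):

```python
import torch
import torch.nn.functional as F

def soft_cross_entropy(logits, target_probs):
    # logits:       (batch, num_classes), raw scores from your model
    # target_probs: (batch, num_classes), rows are probabilities summing to 1
    log_probs = F.log_softmax(logits, dim=1)
    # expected negative log-likelihood under the target distribution,
    # averaged over the batch
    return -(target_probs * log_probs).sum(dim=1).mean()

# quick usage check with made-up data
logits = torch.randn(5, 3, requires_grad=True)
targets = torch.softmax(torch.randn(5, 3), dim=1)
loss = soft_cross_entropy(logits, targets)
loss.backward()
```

This reduces to the usual cross-entropy when each target row is a
one-hot vector, and it backpropagates through the logits just like
the built-in version.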
Best.
K. Frank