Typo in the CrossEntropyLoss documentation page

Hi guys, I found a typo on the CrossEntropyLoss documentation page.
https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html?highlight=cross%20entropy%20loss
The denominator of the 3rd formula on the page should be a sum of exponents, not the exponent of a sum, so that it matches the softmax definition.
I verified it with some code and am happy to provide it.
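For example, something along these lines (an illustrative sketch, not the exact script, with arbitrary toy values) shows that the built-in loss matches the sum-of-exponents denominator rather than the exponent-of-the-sum version:

```python
import torch
import torch.nn as nn

# Toy logits and target, values chosen arbitrarily for illustration
x = torch.tensor([[1.0, 2.0, 3.0]])
target = torch.tensor([2])

# Denominator as a sum of exponents (matches softmax)
manual = -torch.log(torch.exp(x[0, target]) / torch.exp(x[0]).sum())

# Denominator as the exponent of the sum (as the docs currently read)
wrong = -torch.log(torch.exp(x[0, target]) / torch.exp(x[0].sum()))

builtin = nn.CrossEntropyLoss()(x, target)

print(builtin.item(), manual.item(), wrong.item())
# The built-in loss agrees with `manual` (~0.41 here), not with `wrong` (3.0)
```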
Regards,
Javier

Thanks for letting us know! Would you be interested in providing the fix for it? :slight_smile:

Hi ptrblck,

You are a legend in the forum and the mere fact that you are replying to me is the coolest thing this year!

I’m happy to provide a fix; the question is what’s inaccurate: the formula in the docs or the code?

If the problem is in the formula, then it’s a simple fix. But if it’s in the code, would you send me a link to the code where the cross entropy is calculated, please?

Kind regards,

Javier

Ha, thanks for the kind words!
I took a quick look at the formula and think the docs are wrong, as it seems the log_softmax formula (log(exp(x_i) / sum_j(exp(x_j)))) has the exp and sum mixed up.
The docs are defined here and you can check the formula used here, which is indeed using log_softmax (I hope I’m not missing something obvious :stuck_out_tongue: ).
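A quick illustrative check of that relationship (not the internal implementation itself, just a small sketch with random inputs) is that cross_entropy behaves like log_softmax followed by nll_loss:

```python
import torch
import torch.nn.functional as F

# Random logits and integer targets, shapes chosen arbitrarily
x = torch.randn(4, 5)
target = torch.randint(0, 5, (4,))

ce = F.cross_entropy(x, target)
# cross_entropy is equivalent to log_softmax + nll_loss
via_log_softmax = F.nll_loss(F.log_softmax(x, dim=1), target)

print(torch.allclose(ce, via_log_softmax))  # True
```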

Thanks ptrblck,

You are on the money! I agree with you; I believe the formula in the docs has a typo.

Do you want me to edit the docs, or do you guys do that?

Kind regards,

Javier

If you are interested in working on the PR, please go for it!

Just did! :slight_smile:

Let me know if there is a problem with it.

Thanks! :slight_smile: