Difference between Cross-Entropy Loss and Log-Likelihood Loss?

Hi, I wrote a short post about KL divergence, Cross-Entropy, and Negative Log-Likelihood loss a few weeks ago: https://medium.com/@stepanulyanin/notes-on-deep-learning-theory-part-1-data-generating-process-31fdda2c8941. I hope you can find a few answers there too.
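To answer the question directly here as well: for a one-hot (hard) target, cross-entropy reduces to the negative log-likelihood of the target class under the softmax distribution, so the two losses compute the same number; they only differ in what input they expect (raw logits vs. log-probabilities). A minimal sketch in plain Python (no framework assumed, function names are mine for illustration):

```python
import math

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def cross_entropy(logits, target_idx):
    # Cross-entropy with a one-hot target reduces to -log p(target class).
    probs = softmax(logits)
    return -math.log(probs[target_idx])

def nll(log_probs, target_idx):
    # NLL expects log-probabilities and simply negates the target entry.
    return -log_probs[target_idx]

logits = [2.0, 1.0, 0.1]
target = 0
log_probs = [math.log(p) for p in softmax(logits)]

# The two values coincide: cross-entropy(logits) == NLL(log-softmax(logits)).
assert abs(cross_entropy(logits, target) - nll(log_probs, target)) < 1e-12
```

This mirrors why, in frameworks like PyTorch, a cross-entropy loss on logits is equivalent to applying log-softmax followed by an NLL loss.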
