Query about CrossEntropy loss

hi friends
For multiclass classification where the classes range from 0 to N:
how does PyTorch's CrossEntropyLoss handle integer class labels?
Does it convert them into one-hot (0/1) vectors?

sum_{i=1}^{N} sum_{k=1}^{K} -y_{i,k} * log(p_{i,k})

Is y an integer here, or a one-hot vector like 00001000?

I am pretty sure it does that implicitly, because while one-hot encoding is nice on paper (mathematically convenient to express), it would be very inefficient for a computer to do explicitly. Either way, it does exactly what you'd expect if you carried out the one-hot encoding manually (quick demo here: https://github.com/rasbt/stat479-deep-learning-ss19/blob/master/L08_logistic/code/cross-entropy-pytorch.ipynb)
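For what it's worth, here is a quick sketch of that equivalence: PyTorch's built-in loss taking integer class indices directly, compared against the manual one-hot computation. The batch size, class count, and logit values are made up for illustration:

```python
import torch
import torch.nn.functional as F

# Toy logits for a batch of 3 samples and 5 classes (values made up).
logits = torch.tensor([[ 2.0, 0.5, -1.0, 0.1,  0.3],
                       [ 0.2, 1.5,  0.3, 0.0, -0.5],
                       [-0.3, 0.1,  0.4, 2.2,  0.0]])
targets = torch.tensor([0, 1, 3])  # integer class labels, no one-hot needed

# Built-in: accepts integer class indices directly.
loss_builtin = F.cross_entropy(logits, targets)

# Manual: one-hot encode, then compute mean of -sum(y * log p(y)).
onehot = F.one_hot(targets, num_classes=5).float()
log_probs = F.log_softmax(logits, dim=1)
loss_manual = -(onehot * log_probs).sum(dim=1).mean()

print(loss_builtin.item(), loss_manual.item())  # identical up to float error
```

Internally the integer label just picks out the one log-probability that the one-hot dot product would select, so the two versions give the same number.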


Thanks Raschka
Have they changed it to work this way?

Is it one and the same thing?
In what way does the loss vary between correct and incorrect predictions?