Hello, I'm trying to make sense of classification loss results. The examples in the documentation only use random inputs, so it's hard to gain perspective on the numbers I'm seeing. I would expect all of these results (apart from maybe the first nll_loss) to be 0, since the input is just the target tensor encoded as one-hot, yet I'm seeing 0.5514 for what should be 0 loss.
I'm probably missing something obvious; in the past I've only dealt with regression tasks, and these classification results just seem wrong. Everything I've found while searching suggests this should work the way I expect, yet it doesn't.
```python
import torch
import torch.nn as nn
import torch.nn.functional as F

a = torch.tensor([0, 1, 2])
b = F.one_hot(a).float()

F.cross_entropy(b, a)                    # returns 0.5514
F.nll_loss(b, a)                         # returns -1
F.nll_loss(F.log_softmax(b, dim=1), a)   # returns 0.5514
nn.BCELoss()(b, b)                       # returns 0
```
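For what it's worth, the 0.5514 figure can be reproduced by hand with plain math (no torch involved), so it doesn't look like an environment issue on my end. Each row of the one-hot tensor is e.g. `[1, 0, 0]`, and taking softmax over that row gives `e / (e + 2)` for the "hot" entry:

```python
import math

# Softmax of a one-hot row [1, 0, 0]: the hot entry gets
# e^1 / (e^1 + e^0 + e^0) = e / (e + 2)
p_hot = math.e / (math.e + 2)    # ≈ 0.5761

# Negative log of that probability matches the loss I'm seeing
manual_loss = -math.log(p_hot)
print(round(manual_loss, 4))     # 0.5514
```

So the number is consistent across runs, which makes me think the behavior is intentional and I'm misunderstanding what the loss functions expect as input.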