CNN gives negative output

Hello,

I have trained my model using CrossEntropyLoss, so I haven't used a softmax or a sigmoid at the end of the model (it is binary classification, but I set it up as a two-class softmax problem rather than using a sigmoid).
However, when I test the model, it gives negative outputs (which I interpreted as probabilities), and I assume this is wrong.
Am I supposed to add a softmax at the end of the model during testing? If so, how should I tune it then?

Thank you

Since you are using nn.CrossEntropyLoss, the model is supposed to return raw logits, which can contain arbitrary (including negative) values. The negative values are thus not wrong but expected.
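For reference, here is a minimal sketch showing why logits are the expected input: nn.CrossEntropyLoss (equivalently F.cross_entropy) applies log_softmax followed by nll_loss internally, so the softmax must not be applied in the model itself:

```python
import torch
import torch.nn.functional as F

logits = torch.randn(4, 2)            # raw model outputs; may be negative
targets = torch.tensor([0, 1, 1, 0])  # class indices

# cross_entropy == log_softmax + nll_loss applied internally
loss_ce = F.cross_entropy(logits, targets)
loss_manual = F.nll_loss(F.log_softmax(logits, dim=1), targets)
print(torch.allclose(loss_ce, loss_manual))  # True
```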
To get the predicted classes you can directly use preds = torch.argmax(output, dim=1) on the raw logits. To see the probabilities, you can apply softmax to the output, but do not feed these probabilities into the loss function.
Also, note that torch.argmax returns the same predicted classes for logits and probabilities, since softmax is monotonic and the index of the maximum value won't change.
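A small sketch of this inference step, assuming a two-class output of shape [batch_size, 2]:

```python
import torch
import torch.nn.functional as F

output = torch.tensor([[-0.5,  1.2],
                       [ 2.0, -1.0]])  # raw logits from the model

preds = torch.argmax(output, dim=1)    # predicted classes from logits
probs = F.softmax(output, dim=1)       # probabilities, for inspection only

# argmax gives identical results on logits and probabilities
assert torch.equal(preds, torch.argmax(probs, dim=1))
print(preds)  # tensor([1, 0])
```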
