Validation loss increases along with validation accuracy

Hi everyone, I have run into this situation and I'm confused. I have a few questions and would appreciate your help:

  1. The validation loss is increasing while the validation accuracy is also increasing. So should the best model be the one with the lowest loss or the one with the highest accuracy?

  2. If we choose the model with the highest accuracy as the best one, then looking at the losses it is easy to see an overfitting scenario (low training loss and high validation loss).

  3. If we choose the model with the lowest loss as the best one: why is the loss sometimes high even when the accuracy is high? It looks like the predictions are correct but have low confidence (i.e., low predicted probability; I use cross-entropy loss). So is high confidence always good? A small sketch after this list illustrates the situation.
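A minimal, hypothetical illustration of question 1 (the numbers are made up, not from my training run): between two checkpoints, accuracy can go up while the mean cross-entropy loss also goes up, because the loss depends on how confident the predictions are, not just on the argmax.

```python
import torch
import torch.nn.functional as F

targets = torch.tensor([0, 0, 0, 0])  # four samples, all of true class 0

# Checkpoint A: 3/4 correct; the correct predictions are confident.
logits_a = torch.tensor([[3.0, 0.0],
                         [3.0, 0.0],
                         [3.0, 0.0],
                         [0.0, 1.0]])

# Checkpoint B: 4/4 correct, but every prediction is only barely correct.
logits_b = torch.tensor([[0.1, 0.0]] * 4)

for name, logits in [("A", logits_a), ("B", logits_b)]:
    accuracy = (logits.argmax(dim=1) == targets).float().mean().item()
    loss = F.cross_entropy(logits, targets).item()
    print(f"checkpoint {name}: accuracy={accuracy:.2f}  loss={loss:.3f}")

# checkpoint A: accuracy=0.75  loss=0.365
# checkpoint B: accuracy=1.00  loss=0.644  (accuracy up, loss up)
```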

Hi,

From the little experience I have, I tend to take the best model as the one with the highest accuracy, provided there is no overfitting (which is not your case).

Yes, you are right.

Yes, a model with high confidence in its correct predictions is preferable. That is also why the loss is often the better metric to look at when you want to determine whether the model is overfitting: the accuracy does not differentiate between a prediction made at probability 0.5 and one made at probability 0.9 (both can yield the correct class), but the loss does.
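A quick sketch of that point (the probabilities are made up; 0.55 stands in for "barely above chance" so that the binary argmax is unambiguous): both predictions count as correct, yet their cross-entropy losses differ a lot.

```python
import torch
import torch.nn.functional as F

target = torch.tensor([1])  # the true class is 1

# log-probabilities used directly as logits (softmax recovers them)
confident = torch.tensor([[0.10, 0.90]]).log()  # p(correct class) = 0.90
unsure = torch.tensor([[0.45, 0.55]]).log()     # p(correct class) = 0.55

for name, logits in [("confident", confident), ("unsure", unsure)]:
    accuracy = (logits.argmax(dim=1) == target).float().mean().item()
    loss = F.cross_entropy(logits, target).item()
    print(f"{name}: accuracy={accuracy:.0%}  loss={loss:.3f}")

# confident: accuracy=100%  loss=0.105
# unsure:    accuracy=100%  loss=0.598
```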

In your situation, I would try modifying the model's architecture to get rid of the overfitting you are seeing.

Good luck!


@ptrblck Sorry for reviving an old topic. I am training a model that reaches a high validation accuracy, but both the training loss and the validation loss are high, which suggests low confidence in the predictions. Is there a way to get a lower training loss? I am using Adam with the learning rate lr=0.003.
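For reference, a minimal sketch of the setup described above; `model`, the dummy data, and the `ReduceLROnPlateau` scheduler are placeholders and assumptions on my part, not code from this thread. Shrinking the learning rate once the training loss plateaus is one common way to let Adam settle into a lower loss.

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 2)  # stand-in for the actual model (an assumption)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.003)

# One common option (an assumption, not the poster's code): reduce the
# learning rate when the training loss plateaus.
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=5)

for epoch in range(20):
    inputs = torch.randn(32, 10)          # dummy batch
    targets = torch.randint(0, 2, (32,))  # dummy labels
    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    optimizer.step()
    scheduler.step(loss.item())  # step on the monitored metric
```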