I am using F.log_softmax + nn.NLLLoss, and after about 150 epochs both my training and validation loss curves are flat at nearly 0.
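For reference, here is a minimal sketch of the setup (the model, feature size, and batch are simplified stand-ins, not my exact code):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    # Hypothetical stand-in model; my real architecture differs.
    def __init__(self, in_features=128, n_classes=9):
        super().__init__()
        self.fc = nn.Linear(in_features, n_classes)

    def forward(self, x):
        # log_softmax over the class dimension, paired with NLLLoss below
        return F.log_softmax(self.fc(x), dim=1)

model = Net()
criterion = nn.NLLLoss()  # expects log-probabilities as input

x = torch.randn(32, 128)              # dummy batch
targets = torch.randint(0, 9, (32,))  # dummy labels for 9 classes
loss = criterion(model(x), targets)
```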
Normally with cross-entropy I'd expect the validation curve to eventually go back up (overfitting), but that is not happening here.
If I understand this setup correctly, the higher the model's confidence in the correct class, the closer the loss gets to 0? (See the quick numeric check after my questions below.)
Does this mean reaching zero is ideal?
Does the fact that my model is trending close to zero mean I should stop training?
How do I explain the validation curve not going back up (away from zero)?
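For context on the first question, here is a quick numeric check (illustrative values only) showing that NLLLoss is -log(p(correct)), so it approaches 0 as confidence in the correct class approaches 1:

```python
import torch
import torch.nn as nn

criterion = nn.NLLLoss()
target = torch.tensor([0])  # correct class is index 0

for confidence in (0.5, 0.9, 0.99, 0.999):
    # Build a probability vector that puts `confidence` on the correct class
    # and spreads the remainder over the other 8 classes
    probs = torch.full((1, 9), (1 - confidence) / 8)
    probs[0, 0] = confidence
    loss = criterion(probs.log(), target)  # NLLLoss takes log-probabilities
    print(f"p(correct)={confidence}: loss={loss.item():.4f}")
# p(correct)=0.5:   loss=0.6931
# p(correct)=0.999: loss=0.0010
```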
The dataset:
9 classes, balanced representation, ~49,000 samples
I did search before posting but couldn't find an answer.