Loss value 0 indication

Hi,

When I get 0 loss, is it always an indication of overfitting? I am getting around 50% accuracy even on the training set, although my loss is 0 after a while.

What could be the problem in this case? Could you please help me if you have any idea? Thanks a lot

Hi Lyca!

You don’t provide any detail on what you are doing, so it’s hard to
be certain, but you very likely have a bug somewhere.

Common loss functions only go to zero when the predictions are
“completely right.” Such predictions should – on your training
set – produce perfect accuracy. So something isn’t lining up here.

Best.

K. Frank

Thanks a lot @KFrank, I would expect the same thing on the training set, which is why I asked. My loss function is the standard triplet loss, which minimises the distance between the anchor and the positive and maximises the distance between the anchor and the negative.
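For reference, here is a minimal sketch of the loss I mean, assuming PyTorch's built-in nn.TripletMarginLoss (the tensor values are just made-up illustrations):

```python
import torch
import torch.nn as nn

# Standard triplet loss: max(d(a, p) - d(a, n) + margin, 0)
loss_fn = nn.TripletMarginLoss(margin=1.0, p=2)

anchor   = torch.zeros(4, 8)          # d(a, p) ~ 0
positive = torch.zeros(4, 8)
negative = torch.ones(4, 8) * 10.0    # d(a, n) is large

loss = loss_fn(anchor, positive, negative)
print(loss.item())  # 0.0 -- the margin is satisfied for every triplet
```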

I am evaluating the accuracy on the same set that I used for training. Is there any chance that the problem is related to batching somehow?

Best

Hi Lyca!

Let me take back what I said. Triplet loss is not a loss function for
which zero loss means “completely right.” (In fact, it’s not even
clear to me what “completely right” would mean in practice in a
case where one would use triplet loss.)
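To make this concrete, here is a small illustration (using PyTorch's nn.TripletMarginLoss with hand-picked toy embeddings): the loss hinges to zero as soon as the negative is at least `margin` farther from the anchor than the positive, even when the positive itself is far from the anchor. So zero loss only tells you the sampled triplets satisfy the margin, not that the embedding is "right" in any classification sense.

```python
import torch
import torch.nn as nn

loss_fn = nn.TripletMarginLoss(margin=1.0, p=2)

# The positive is far from the anchor -- hardly "completely right" --
# yet the loss is zero, because the negative is farther still:
# max(d(a, p) - d(a, n) + margin, 0) = max(5 - 7 + 1, 0) = 0
anchor   = torch.zeros(1, 2)
positive = torch.tensor([[5.0, 0.0]])  # d(a, p) = 5
negative = torch.tensor([[0.0, 7.0]])  # d(a, n) = 7 >= 5 + 1

loss = loss_fn(anchor, positive, negative)
print(loss.item())  # 0.0
```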

Best.

K. Frank

What is the reason for that? If the loss reaches 0, I would expect the model to have overfit the training data. Is this statement wrong?