When I get 0 loss, is it always an indication of overfitting? I am getting only around 50% accuracy even on the training set, although my loss reaches 0 after a while.

What could be the problem in this case? Could you please help me if you have any ideas? Thanks a lot.

You don’t provide any detail on what you are doing, so it’s hard to
be certain, but you very likely have a bug somewhere.

Common loss functions only go to zero when the predictions are
“completely right.” Such predictions should – on your training
set – produce perfect accuracy. So something isn’t lining up here.

Thanks a lot @KFrank, I would expect the same thing on the training set, which is why I asked. This loss function is the standard triplet loss, which minimises the distance between the anchor and positive embeddings while maximising the distance between the anchor and negative embeddings.

I am evaluating the accuracy on the same set that I used for training. Is there any chance the problem is somehow related to batching?

Let me take back what I said. Triplet loss is not a loss function for
which zero loss means “completely right.” (In fact, it’s not even
clear to me what “completely right” would mean in practice in a
case where one would use triplet loss.)