Is it possible for the Cross-Entropy train loss to have a lower value than my train error?

I have a model where, after 60 epochs, the train error is larger than the train loss. Is that normal?

epoch=59, train_loss_epoch=0.03931455929022359, train_error_epoch=0.9796516262755102, test_loss_epoch=1.836074846982956, test_error_epoch=0.8577952665441175

I didn’t think this was possible because I thought cross-entropy (CE) was an upper bound on the error (0-1 loss); I think this is called a convex relaxation. If that’s true, then shouldn’t the train error I am observing be impossible? (I think the same should also hold for the test loss vs. the test error.)
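
For reference, here is roughly what I mean by the per-epoch loss and error above (a simplified sketch, not my actual training code; it assumes PyTorch and a standard classification setup):

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def epoch_metrics(model, loader, device="cpu"):
    """Mean cross-entropy loss and mean 0-1 error over one pass of the data."""
    model.eval()
    total_loss, total_wrong, n = 0.0, 0, 0
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        logits = model(x)                                           # (batch, num_classes)
        total_loss  += F.cross_entropy(logits, y, reduction="sum").item()
        total_wrong += (logits.argmax(dim=1) != y).sum().item()     # 0-1 loss per sample
        n += y.size(0)
    return total_loss / n, total_wrong / n   # e.g. train_loss_epoch, train_error_epoch
```

So the loss is the mean of -log p(correct class) in nats, while the error is the fraction of misclassified samples.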

Cross-entropy loss is a negative log-likelihood loss, so it’s not really in the [0, 1] range…

Agreed, but it’s always positive and above the error, no?

Yeah, it’s always positive, but not necessarily above the error. For example, consider a 2-way classification problem where the softmax output is always [0.5 - eps, 0.5 + eps] but the true label is always 0. Then the cross-entropy loss is -log(0.5 - eps) ≈ log 2 ≈ 0.693 < 1, but the 0-1 error is 1.
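
To make that concrete, here is a quick numerical check (a minimal sketch assuming PyTorch; eps = 0.01 and the batch size are arbitrary choices):

```python
import torch
import torch.nn.functional as F

eps = 0.01

# Softmax output is always [0.5 - eps, 0.5 + eps]; the true label is always class 0.
probs = torch.tensor([[0.5 - eps, 0.5 + eps]]).repeat(8, 1)  # batch of 8 identical predictions
logits = probs.log()              # softmax(log(p)) == p when p sums to 1, so these logits reproduce probs
targets = torch.zeros(8, dtype=torch.long)

loss = F.cross_entropy(logits, targets)                      # mean of -log(0.5 - eps) ≈ 0.713, just above log 2
error = (logits.argmax(dim=1) != targets).float().mean()     # predicted class is always 1, so every sample is wrong

print(f"cross-entropy loss: {loss.item():.4f}")   # ≈ 0.7133 < 1
print(f"0-1 error:          {error.item():.4f}")  # 1.0
```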