Does a binary cross entropy with logits loss of 0.5 equal a random guess?

Hi, I have 256 samples labeled 1 and 256 samples labeled 0. My loss seems to converge to 0.51.

Does this mean the model is only making random guesses?

To be precise, I have

domain_loss = F.binary_cross_entropy_with_logits(domain_predictions, domain_y)

and the printed value converges to 0.51.

Not exactly. With two classes, “random” guessing (the model outputting a probability of ~0.5 for every sample) gives a loss of about ln(2) ≈ 0.693. In general we would expect a model at initialization to yield a loss of roughly ln(num_classes), which is also why models trained on ImageNet typically start at a loss of about ln(1000) ≈ 6.9. Since 0.51 is below ln(2), your model is actually doing somewhat better than chance.
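Those reference values are just natural logs (a quick REPL sanity check using the standard math module):

>>> import math
>>> math.log(2)
0.6931471805599453
>>> math.log(1000)
6.907755278982137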

For example:

>>> import torch
>>> output = torch.ones(10000) * 0.5          # constant 0.5 probability for every sample
>>> target = (torch.rand(10000) > 0.5) * 1.0  # random balanced 0/1 labels
>>> torch.nn.BCELoss()(output, target)
tensor(0.6931)
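Since you are using binary_cross_entropy_with_logits, the equivalent “uninformative” prediction is a logit of 0, because sigmoid(0) = 0.5; here is a small sketch showing it gives the same value:

>>> logits = torch.zeros(10000)               # logit 0 corresponds to probability 0.5
>>> target = (torch.rand(10000) > 0.5) * 1.0
>>> torch.nn.functional.binary_cross_entropy_with_logits(logits, target)
tensor(0.6931)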

Truly “confident” random guessing, i.e. hard 0/1 predictions, blows up instead: BCELoss clamps log(0) at -100, so each wrong confident guess contributes about 100 to the mean, and roughly half of the guesses are wrong, e.g.:

>>> output = (torch.rand(10000) > 0.5) * 1.0  # hard 0/1 "predictions"
>>> target = (torch.rand(10000) > 0.5) * 1.0
>>> torch.nn.BCELoss()(output, target)
tensor(50.4500)
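The with-logits version behaves the same way once the logits are pushed far from zero; a sketch assuming arbitrarily chosen ±100 logits to mimic hard guesses (the exact printout varies with the random draw):

>>> logits = ((torch.rand(10000) > 0.5) * 2.0 - 1.0) * 100.0  # "confident" guesses as +/-100 logits
>>> target = (torch.rand(10000) > 0.5) * 1.0
>>> torch.nn.functional.binary_cross_entropy_with_logits(logits, target)

This prints roughly tensor(50.), since each wrong confident guess costs about 100 and about half of them are wrong.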