BCEWithLogitsLoss: no binary assertion for gt

Hi,
If I’m not wrong, BCEWithLogitsLoss used to have an assertion that the ground truth is binary, right?
Is there any reason it was removed? (PyTorch 1.2.0) It doesn’t even check that the targets are bounded between 0 and 1:

import torch

crit = torch.nn.BCEWithLogitsLoss()

pred = (torch.rand(10) - 0.5) * 10
gt = torch.randint_like(pred, 0, 2).float()

# confirm that some predictions lie outside the [0, 1] range
assert (pred > 1.).any()
assert (pred < 0.).any()

dif_predgt = crit(pred, gt)
dif_gtpred = crit(gt, pred)  # arguments swapped: the "targets" are clearly not binary
print(dif_gtpred)
print(dif_predgt)

Hi Juan!

First, I wouldn’t want such an assertion.*

Second, I happen to be running pytorch 0.3.0, and I don’t see such an assertion there either.

Here is your script, tweaked to run on 0.3.0:

import torch
print (torch.__version__)

crit = torch.nn.BCEWithLogitsLoss()

pred = (torch.rand (10) - 0.5) * 10
gt = torch.round (torch.rand (10))
assert (pred > 1.).any()
assert (pred < 0.).any()

dif_predgt = crit (pred, gt)
dif_gtpred = crit (gt, pred)
print (dif_gtpred)
print (dif_predgt)

And here is the output:

>>> import torch
>>> print (torch.__version__)
0.3.0b0+591e73e
>>>
>>> crit = torch.nn.BCEWithLogitsLoss()
>>>
>>> pred = (torch.rand (10) - 0.5) * 10
>>> gt = torch.round (torch.rand (10))
>>> assert (pred > 1.).any()
>>> assert (pred < 0.).any()
>>>
>>> dif_predgt = crit (pred, gt)
>>> dif_gtpred = crit (gt, pred)
>>> print (dif_gtpred)
1.4237342596054077
>>> print (dif_predgt)
1.8770477950572968

*) It is perfectly reasonable and meaningful for the targets (“ground
truth”) of binary cross-entropy to be non-binary. Your annotated training
data could, for example, be scored with how likely each sample is to
be in class “0” or class “1”. I wouldn’t want pytorch to gratuitously
prevent me from using binary cross-entropy in such a use case.
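For instance, here is a minimal sketch showing that soft targets are
handled exactly as the underlying formula says (the logit and target
values are made up for illustration); the loss agrees with a by-hand
computation of binary cross-entropy:

import torch

crit = torch.nn.BCEWithLogitsLoss()

logits = torch.tensor([1.5, -0.7, 0.2])
soft_targets = torch.tensor([0.9, 0.3, 0.5])  # probabilistic labels, not 0/1

loss = crit(logits, soft_targets)

# the same value, computed by hand from the definition of
# binary cross-entropy on probabilities
p = torch.sigmoid(logits)
manual = -(soft_targets * torch.log(p)
           + (1 - soft_targets) * torch.log(1 - p)).mean()

print(loss.item(), manual.item())  # the two agree up to floating-point error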

You could argue that pytorch should check that the targets are bounded
between 0 and 1. This could make sense to me, but I would probably
argue that such a constraint is the responsibility of the upstream code,
and that BCEWithLogitsLoss (and related losses) shouldn’t have to pay
the minor performance penalty of imposing it.
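If you do want that guarantee, a one-line check in your own data
pipeline is cheap. Here is a sketch (check_targets is a hypothetical
helper, not part of pytorch):

import torch

def check_targets(gt):
    # validate once, upstream, instead of inside every loss call
    assert ((gt >= 0.0) & (gt <= 1.0)).all(), "targets must lie in [0, 1]"
    return gt

crit = torch.nn.BCEWithLogitsLoss()
pred = (torch.rand(10) - 0.5) * 10
gt = torch.randint_like(pred, 0, 2).float()
loss = crit(pred, check_targets(gt))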

Good luck.

K. Frank

You are right. I was just mentioning it because I think several losses assert on their inputs. I just wanted to check whether the assertion went missing in some version :slight_smile: