Multi-label binary classification: result type Float can't be cast to the desired output type Long

Doing multi-label binary classification with BCEWithLogitsLoss, but I get a RuntimeError: “result type Float can’t be cast to the desired output type Long”. Running 1.8.1 on Windows, but the problem also occurred on 1.4. Am I doing something wrong with respect to multi-labels, perhaps?

from torch import FloatTensor, LongTensor
from torch.nn import BCEWithLogitsLoss


criterion = BCEWithLogitsLoss()

logits = FloatTensor([[ 0.1144, -0.3313, -0.1003],
                      [-0.0207, -0.2222, -0.0905],
                      [-0.1814, -0.0793, -0.1249],
                      [ 0.0079, -0.3403,  0.0057]])
labels = LongTensor([[0, 1, 0],
                     [0, 0, 1],
                     [0, 0, 1],
                     [0, 1, 1]])

loss = criterion(logits, labels)

Seems like the solution is to explicitly cast the labels with labels.float(), although that seems quite counter-intuitive for class labels.
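For reference, here is the same snippet with the cast applied at the call site; this runs without the error:

```python
import torch
from torch.nn import BCEWithLogitsLoss

criterion = BCEWithLogitsLoss()

logits = torch.tensor([[ 0.1144, -0.3313, -0.1003],
                       [-0.0207, -0.2222, -0.0905],
                       [-0.1814, -0.0793, -0.1249],
                       [ 0.0079, -0.3403,  0.0057]])
labels = torch.tensor([[0, 1, 0],
                       [0, 0, 1],
                       [0, 0, 1],
                       [0, 1, 1]])  # integer (long) tensor by default

# Cast the integer targets to float to match what BCEWithLogitsLoss expects.
loss = criterion(logits, labels.float())
print(loss)  # a scalar tensor
```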


Hi Bram!

Two comments:

First, as you’ve seen, BCEWithLogitsLoss requires its target to
be a float tensor, not long (or a double tensor, if the input is
double). And yes, converting to float (labels.float()) is the
correct solution.

Second, as to why: Unlike pytorch’s CrossEntropyLoss,
BCEWithLogitsLoss supports labels that are probabilities (sometimes
called “soft” labels). Thus, a label could be 0.333. This would indicate
that the sample has a 33.3% chance of being in the “1”-class (or “yes”-class)
and therefore a 66.7% chance of being in the “0”-class (“no”-class).
So this is probably a “no,” whereas a value of 0.00 would be a “hard”
(non-probabilistic, fully-certain) “no.”

(Note that probabilistic labels make perfect sense for cross entropy,
as well. It’s just that pytorch’s CrossEntropyLoss chooses not to
support them (although perhaps it should). Doing so would require
CrossEntropyLoss to accept a target with a different shape, namely,
with a class dimension.)
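To illustrate the soft-label point, here is a small sketch (logit and label values are made up for the example) that passes probabilistic targets to BCEWithLogitsLoss and checks the result against the binary-cross-entropy formula computed by hand:

```python
import torch
from torch.nn import BCEWithLogitsLoss

criterion = BCEWithLogitsLoss()

logits = torch.tensor([[0.5, -1.2, 2.0]])
# Probabilistic ("soft") targets -- values in [0, 1], not just {0, 1}.
soft_labels = torch.tensor([[0.333, 0.9, 1.0]])

loss = criterion(logits, soft_labels)

# Manual check: mean of -(y * log(p) + (1 - y) * log(1 - p)), p = sigmoid(logit).
p = torch.sigmoid(logits)
manual = -(soft_labels * torch.log(p)
           + (1 - soft_labels) * torch.log(1 - p)).mean()
print(loss, manual)  # the two values agree
```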

Best.

K. Frank
