CrossEntropyLoss vs BCEWithLogitsLoss in multi-label classification

I am doing multi-label classification. When using CrossEntropyLoss I get this error:
RuntimeError: 1D target tensor expected, multi-target not supported
and when I change it to BCEWithLogitsLoss I get this error:
ValueError: Target size (torch.Size([1, 1])) must be the same as input size (torch.Size([1, 18]))

The label indices for the first 10 examples look as follows:
[[0, 1], [0], [0], [0, 1], [0, 2], [0, 1], [0], [0, 1], [0], [0]]

I have 18 classes. Would you please let me know what I am doing wrong?

Thank you

Assuming I’m remembering correctly, BCEWithLogitsLoss expects a multi-hot encoding rather than a sparse list of indices, i.e. each target should be a float tensor of length 18, with a 1 at every active class index and a 0 elsewhere. That also explains your size mismatch: the model outputs logits of shape [1, 18], so the target must have shape [1, 18] as well.
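Here is a minimal sketch of that conversion, using the sparse label lists from your post and a dummy logits tensor in place of your model's output (the variable names are just placeholders):

```python
import torch
import torch.nn as nn

# Sparse label indices from the question (first 10 examples), 18 classes
labels = [[0, 1], [0], [0], [0, 1], [0, 2], [0, 1], [0], [0, 1], [0], [0]]
num_classes = 18

# Build multi-hot targets: one row per example, 1.0 at each active class index
targets = torch.zeros(len(labels), num_classes)
for row, idxs in enumerate(labels):
    targets[row, idxs] = 1.0

# Dummy logits with matching shape; in practice these come from your model
logits = torch.randn(len(labels), num_classes)

criterion = nn.BCEWithLogitsLoss()
loss = criterion(logits, targets)  # shapes now match: [10, 18] vs [10, 18]
```

Note that the targets must be floats, since BCEWithLogitsLoss computes a per-class binary cross entropy against them.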
