What is the difference between multi-class classification and binary classification in PyTorch code?

I have already done some text classification tasks with PyTorch.

But when I reviewed my code, I found that they are basically the same except for the loss function (CrossEntropyLoss vs. BCEWithLogitsLoss).

In my opinion, at least the activation function should be different (softmax vs. sigmoid).

So can someone tell me the difference between multi-class classification and binary classification in PyTorch code?

nn.CrossEntropyLoss is used for multi-class classification, i.e. each sample belongs to exactly one class out of multiple classes. nn.BCEWithLogitsLoss is used for binary classification, i.e. each sample belongs to either the positive or the negative class (soft targets in between are also possible), or for multi-label classification, i.e. each sample belongs to zero, one, or multiple classes. Both losses take raw logits: nn.CrossEntropyLoss applies log-softmax internally, while nn.BCEWithLogitsLoss applies the sigmoid internally, so you don't add an explicit activation before either one.
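A minimal sketch of the difference in tensor shapes and target dtypes (the batch size and class count here are just placeholders):

```python
import torch
import torch.nn as nn

batch_size, num_classes = 4, 3

# Multi-class: logits of shape [batch_size, num_classes],
# targets are class indices (LongTensor) of shape [batch_size].
# CrossEntropyLoss applies log-softmax internally.
mc_logits = torch.randn(batch_size, num_classes)
mc_targets = torch.randint(0, num_classes, (batch_size,))
ce_loss = nn.CrossEntropyLoss()(mc_logits, mc_targets)

# Binary: logits of shape [batch_size, 1],
# targets are floats in [0, 1] with the same shape.
# BCEWithLogitsLoss applies the sigmoid internally.
bin_logits = torch.randn(batch_size, 1)
bin_targets = torch.randint(0, 2, (batch_size, 1)).float()
bce_loss = nn.BCEWithLogitsLoss()(bin_logits, bin_targets)

# Multi-label: logits of shape [batch_size, num_classes],
# targets are multi-hot float tensors of the same shape.
ml_logits = torch.randn(batch_size, num_classes)
ml_targets = torch.randint(0, 2, (batch_size, num_classes)).float()
ml_loss = nn.BCEWithLogitsLoss()(ml_logits, ml_targets)

print(ce_loss.item(), bce_loss.item(), ml_loss.item())
```

So besides swapping the loss, the main code difference is the shape and dtype of the targets and the size of the model's output layer (num_classes logits vs. a single logit per sample).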
