I am trying to find the equivalent of TensorFlow's sigmoid_cross_entropy_with_logits loss in PyTorch, but the closest thing I can find is MultiLabelSoftMarginLoss.
Can someone point me to the equivalent loss? If it doesn't exist, that information would be useful as well, so I can submit a suitable PR.
Maybe the answer to this Stack Overflow question is helpful.
In mathematical terms, what exactly do you want to do? That might make it easier for people to help you, rather than trying to port over a TF function.
If you want to do multi-label classification, so do I, but I haven't figured out how to do it in PyTorch yet, so I'm also interested in your question.
From the implementation details, it would seem that the MultiLabelSoftMarginLoss is indeed the equivalent of the sigmoid_cross_entropy_with_logits loss. Closing this!
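A quick sanity check of this equivalence (shapes and seed chosen arbitrarily for illustration): MultiLabelSoftMarginLoss averages the per-class sigmoid cross entropy over classes and then over the batch, which for a fixed-size label matrix is the same as the mean-reduced element-wise sigmoid cross entropy.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
logits = torch.randn(8, 5)                      # raw scores, no sigmoid applied
targets = torch.randint(0, 2, (8, 5)).float()   # multi-hot labels

a = nn.MultiLabelSoftMarginLoss()(logits, targets)
b = nn.BCEWithLogitsLoss()(logits, targets)     # mean over all elements

print(torch.allclose(a, b, atol=1e-6))          # the two reductions coincide here
```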
Just for anyone else who finds this from Google (as I did): BCEWithLogitsLoss now does the equivalent of sigmoid_cross_entropy_with_logits from TensorFlow. It combines the sigmoid and the binary cross entropy into one numerically stable operation.
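A small sketch of what "numerically stable combination" means in practice (tensor shapes are arbitrary): the fused version matches sigmoid followed by BCELoss, and with reduction='none' it returns the element-wise losses, which mirrors TF's unreduced output.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
logits = torch.randn(4, 3)
targets = torch.rand(4, 3)  # targets may be probabilities in [0, 1]

# Fused, numerically stable version (works directly on logits).
stable = F.binary_cross_entropy_with_logits(logits, targets)

# Naive two-step version: explicit sigmoid, then BCE on probabilities.
naive = F.binary_cross_entropy(torch.sigmoid(logits), targets)

print(torch.allclose(stable, naive, atol=1e-6))

# Element-wise (unreduced) losses, like TF's sigmoid_cross_entropy_with_logits:
per_element = F.binary_cross_entropy_with_logits(logits, targets,
                                                 reduction='none')
print(per_element.shape)  # same shape as the inputs
```

The fused form avoids overflow for large-magnitude logits, where computing sigmoid first can saturate to exactly 0 or 1 and make the subsequent log blow up.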
Worth noting that KLDivLoss still needs to be run with reduction='batchmean' to get the "soft cross entropy" behavior people are asking for. Surprised this isn't more clearly documented…
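To illustrate the point above (shapes and seed are arbitrary): with reduction='batchmean', kl_div on log-probabilities differs from the soft cross entropy only by the entropy of the target distribution, which is constant with respect to the model, so the two give identical gradients.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
logits = torch.randn(4, 5)
# Soft targets: a probability distribution per sample (softmax keeps them > 0).
target = F.softmax(torch.randn(4, 5), dim=1)

log_probs = F.log_softmax(logits, dim=1)

# KLDivLoss expects log-probabilities as input; 'batchmean' divides by batch size.
kl = F.kl_div(log_probs, target, reduction='batchmean')

# "Soft cross entropy": -sum(target * log_probs), averaged over the batch.
soft_ce = -(target * log_probs).sum(dim=1).mean()

# Entropy of the target distribution (constant w.r.t. the logits).
entropy = -(target * target.log()).sum(dim=1).mean()

print(torch.allclose(kl + entropy, soft_ce, atol=1e-6))  # KL = CE - H(target)
```

The default reduction ('mean') instead divides by the number of elements, which silently rescales the loss by the number of classes; that is the trap the post is warning about.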