MultiLabelMarginLoss definition

Here is the definition given by PyTorch for its MultiLabelMarginLoss:

loss(x, y) = sum_ij(max(0, 1 - (x[y[j]] - x[i]))) / x.size(0)

"where i == 0 to x.size(0), j == 0 to y.size(0), y[j] != 0, and i != y[j] for all i and j."
It doesn’t make sense to me at all, since I expected a hinge loss to look something like L(y) = sum_j max(0, 1 - x[j]*y[j]). Could anyone clarify this?
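
For what it’s worth, here is a small sketch of how I am currently reading the formula, checked against nn.MultiLabelMarginLoss (the manual loop is only my own interpretation, and the numbers are made up):

```python
import torch
import torch.nn as nn

# Toy example: 4 classes; the targets are classes 3 and 0, padded with -1.
x = torch.tensor([[0.1, 0.2, 0.4, 0.8]])
y = torch.tensor([[3, 0, -1, -1]])

builtin = nn.MultiLabelMarginLoss()(x, y)

# My reading of the docs: for every target index j (up to the first -1) and
# every class i that is not itself a target, accumulate
# max(0, 1 - (x[y[j]] - x[i])), then divide by the number of classes.
C = x.size(1)
targets = [t.item() for t in y[0] if t.item() >= 0]
total = 0.0
for j in targets:
    for i in range(C):
        if i not in targets:
            total += max(0.0, 1.0 - (x[0, j] - x[0, i]).item())
manual = total / C

# Both come out to ~0.85 for me, so this at least reproduces the built-in
# value, even if the why still escapes me.
print(builtin.item(), manual)
```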
Also, I would really appreciate it if anyone could point me to the source code of this loss function.
Thanks!

Are you trying to do multi-label classification? If so, use BCELoss, not MultiLabelMarginLoss. I will be uploading a tutorial on this soon, as a lot of people are having problems with multi-label classification.
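
A minimal sketch of what that looks like for multi-label targets (the shapes and numbers below are made up purely for illustration):

```python
import torch
import torch.nn as nn

# Hypothetical setup: a batch of 2 samples, 4 possible labels per sample.
logits = torch.randn(2, 4)                  # raw model outputs
targets = torch.tensor([[1., 0., 1., 0.],   # multi-hot targets: each sample
                        [0., 1., 1., 1.]])  # can have several active labels

# BCELoss expects probabilities, so pass the logits through a sigmoid first.
probs = torch.sigmoid(logits)
loss = nn.BCELoss()(probs, targets)
print(loss.item())
```

If your model outputs raw logits, nn.BCEWithLogitsLoss folds the sigmoid into the loss and is more numerically stable.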

Hi! I was wondering whether you ended up uploading the tutorial on multi-label classification? I could not find it. Thanks :slight_smile:

Hi, I thought someone else had written one, so I didn’t do it. Can you try looking in this forum and see if you can find it? If not, I can write one whenever I get some time (probably next month).

Thanks.
I think there is no tutorial on this, but I did find some discussions on multi-label classification.
Could you please explain why we should use BCELoss rather than MultiLabelMarginLoss?

Isn’t BCELoss for binary classification only?

It can also be used for multi-label classification when the targets are encoded as multi-hot vectors (several entries can be 1). The term binary refers to the two-term per-element definition, not to the number of classes: each target entry is treated as an independent 0/1 decision.
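
To make that concrete, here is a small sketch (with toy numbers of my own) showing that nn.BCELoss simply averages the two-term binary cross-entropy over every element of a multi-hot target:

```python
import torch
import torch.nn as nn

# Toy numbers: 1 sample, 3 labels, multi-hot target.
probs = torch.tensor([[0.9, 0.2, 0.6]])
target = torch.tensor([[1., 0., 1.]])

# nn.BCELoss averages the per-element binary cross-entropy
#   -(t * log(p) + (1 - t) * log(1 - p))
# over all elements of the target.
builtin = nn.BCELoss()(probs, target)
manual = -(target * probs.log() + (1 - target) * (1 - probs).log()).mean()
print(builtin.item(), manual.item())  # the two values should match
```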