Looking through the documentation, I was not able to find the standard binary classification hinge loss function, like the one defined on wikipedia page:

l(y) = max(0, 1 - t*y), where t ∈ {-1, 1}

Is this loss implemented?

Like for doing an MCSVM. I'm not sure; I was looking for that the other day myself too but didn't see one. Let me know if you find it, please. I was gonna do a more thorough check later, but that would save me the time 😁

They have MultiMarginLoss and MultiLabelMarginLoss. But the one you're looking for in particular is MarginRankingLoss, and it suits your needs.
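A quick sketch of why MarginRankingLoss works here: it computes max(0, -y*(x1 - x2) + margin), so if you pass zeros as the second input and set margin=1, it reduces to the standard binary hinge loss max(0, 1 - y*x1). The tensor values below are just illustrative:

```python
import torch

# MarginRankingLoss(x1, x2, y) = max(0, -y * (x1 - x2) + margin).
# With x2 fixed at zero and margin=1, this is max(0, 1 - y * x1),
# i.e. the binary hinge loss (mean-reduced by default).
loss_fn = torch.nn.MarginRankingLoss(margin=1.0)

output = torch.tensor([0.5, -2.0, 3.0])   # raw scores from a model
target = torch.tensor([1.0, -1.0, 1.0])   # labels in {-1, 1}
zeros = torch.zeros_like(output)

loss = loss_fn(output, zeros, target)
```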


Did you find the implementation of this loss in PyTorch? Although I think it would be easier to just implement it yourself.

Maybe you could do something like this:

```
import torch

class MyHingeLoss(torch.nn.Module):
    """Elementwise hinge loss: max(0, 1 - target * output), target in {-1, 1}."""

    def __init__(self):
        super(MyHingeLoss, self).__init__()

    def forward(self, output, target):
        # 1 - t*y, then zero out the entries where the margin is already met
        hinge_loss = 1 - torch.mul(output, target)
        hinge_loss[hinge_loss < 0] = 0
        return hinge_loss
```
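The masking step can also be written with torch.clamp, which is a bit more idiomatic. A minimal sketch (the mean reduction is my own choice here, so the function returns a scalar like the built-in losses do):

```python
import torch

def hinge_loss(output, target):
    # max(0, 1 - t*y) per element, then average to a scalar loss
    return torch.clamp(1 - output * target, min=0).mean()

out = torch.tensor([0.5, -2.0, 3.0])
tgt = torch.tensor([1.0, -1.0, 1.0])
loss = hinge_loss(out, tgt)
```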
