How to generalize softmax to allow more than one true class

My input is a list of 5 embeddings, and I want to predict which of the 5 entities are selected. If only one entity can be selected, I can use a softmax activation. But what if two of them are selected?

Basically I want a generalized softmax where the sum of the output vector is 2 instead of 1, and each element of the output is less than or equal to 1.
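One common way to get "each output at most 1, with exactly two selected" is to drop the softmax entirely: use an independent sigmoid per entity and pick the top 2 at inference time. This is a sketch of that idea, not the only possible answer; the logit values here are made up for illustration.

```python
import torch

# Independent sigmoids: each output lies in (0, 1), unlike softmax there is
# no constraint that they sum to 1.
logits = torch.tensor([1.5, -0.3, 2.1, 0.2, -1.0])  # scores for 5 entities
probs = torch.sigmoid(logits)

# At inference, select the two entities with the highest probability.
top2 = torch.topk(probs, k=2).indices
selected = torch.zeros(5)
selected[top2] = 1.0  # hard 0/1 selection vector, sums to exactly 2
```

Since sigmoid is monotonic, the top-2 probabilities correspond to the top-2 logits, so the selection here is entities 2 and 0.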

It sounds like you are describing a multi-label objective. One possible loss function for something like this is MultiLabelSoftMarginLoss: MultiLabelSoftMarginLoss — PyTorch 1.9.0 documentation
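For reference, here is a minimal example of using that loss: the model emits one logit per entity, and the target is a multi-hot vector (here, entities 0 and 2 are selected; the logit values are arbitrary).

```python
import torch
import torch.nn as nn

loss_fn = nn.MultiLabelSoftMarginLoss()

# One sample in the batch, 5 entities: shape (batch=1, classes=5).
logits = torch.tensor([[2.0, -1.0, 1.5, -0.5, -2.0]])
# Multi-hot target: entities 0 and 2 are selected.
target = torch.tensor([[1.0, 0.0, 1.0, 0.0, 0.0]])

loss = loss_fn(logits, target)  # scalar loss, averaged over classes and batch
```

Note that this loss treats each label independently, which is exactly the property the follow-up post objects to: nothing enforces that exactly two labels are 1.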

Yes, it is a multi-class objective, but I am not sure if it is also multi-label. From my understanding, multi-label means independent fields: a valid output can have all elements be ones or all zeros. In my problem, however, the sum of the sample's output vector y must be 2, i.e. exactly two of the labels are 1.

In that case, I wonder if simply taking the product of the label spaces (2**5 = 32 classes) would work.
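To make the "product of labels" idea concrete: each of the 2**5 = 32 possible 0/1 selection vectors becomes one class, and you train with ordinary cross-entropy over 32 classes. A hypothetical encoding helper could read the label vector as a binary number:

```python
def to_class_index(labels):
    # Interpret the 0/1 label vector as a binary number, most significant
    # bit first, e.g. [1, 0, 1, 0, 0] -> 0b10100 = 20.
    idx = 0
    for bit in labels:
        idx = idx * 2 + int(bit)
    return idx

# Entities 0 and 2 selected -> one of the 32 classes.
target_class = to_class_index([1, 0, 1, 0, 0])
```

The downside is that 30 of the 32 classes are invalid for this problem (their labels don't sum to 2), so the model wastes capacity on outputs that can never occur.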

I could use 5C2 = 10 classes, but that approach is hard to scale up. Thanks anyway!
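The 5C2 formulation treats each unordered pair of entities as one class. A sketch of the mapping, assuming the pairs are enumerated in a fixed order:

```python
from itertools import combinations

# All unordered pairs of 5 entities: C(5, 2) = 10 classes.
pairs = list(combinations(range(5), 2))
pair_to_class = {p: i for i, p in enumerate(pairs)}

# Class index for "entities 0 and 2 are selected".
class_of_02 = pair_to_class[(0, 2)]
```

The scaling concern is real: choosing k of n entities this way needs C(n, k) classes, which grows combinatorially in n and k.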
