Loss function for multi-label classification with missing values

I am training a network to identify topics in text. A text can contain several topics. For some of the texts, labels for some of the topics are missing (I don't know whether the text contains the topic or not).

I use MultiLabelSoftMarginLoss() as loss function.

I tried to indicate the missing values with a special label, but got bad results.

Do you have a recommendation for another loss function, or a reference to a custom function that someone has written, before I try to reinvent the wheel and write a custom function myself?

Thanks,
Ortal

I don’t know if there is a good implementation of this approach for this loss function, but instead of using a new “missing” label, you could also ignore the missing labels.
To do so, you could first calculate the unreduced loss with any label (including a placeholder for the missing ones) by creating the loss function with reduction='none'. Afterwards you could multiply this unreduced loss with a mask to set the losses of the missing labels to zero, and then reduce it, e.g. via mean().
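A minimal sketch of this masking idea could look like the code below. Note that it uses nn.BCEWithLogitsLoss, which is the same sigmoid-based formulation but keeps the per-element losses with reduction='none' (MultiLabelSoftMarginLoss already averages over the classes and returns one value per sample); the shapes, the placeholder targets, and the mask are made up for illustration:

```python
import torch
import torch.nn as nn

# Toy shapes (assumed for illustration): batch of 4 texts, 5 topics.
logits = torch.randn(4, 5, requires_grad=True)
targets = torch.randint(0, 2, (4, 5)).float()  # 0/1 labels; missing entries can hold any placeholder

# mask is 1. where the label is known and 0. where it is missing.
mask = torch.ones(4, 5)
mask[0, 2] = 0.  # e.g. topic 2 is unlabeled for text 0
mask[3, 1] = 0.

# Element-wise sigmoid loss; reduction='none' keeps the [4, 5] shape.
criterion = nn.BCEWithLogitsLoss(reduction='none')
loss = criterion(logits, targets)

# Zero out the losses of the missing labels, then reduce, e.g. via mean().
masked_loss = loss * mask
final_loss = masked_loss.mean()
final_loss.backward()
```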

Hi. I am doing exactly this for my application: calculating the unreduced loss and applying a mask to zero out some of the values. The values I want to mask depend on the set of ground-truth labels, meaning that the masked values are different for each sample in a batch. Is calling .mean() on the masked tensor sufficient, or should I somehow ignore the zero values when calculating the mean? I tried doing this

https://discuss.pytorch.org/t/use-tensor-mean-method-but-ignore-0-values/60170

but got worse results. Thanks!
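
For reference, a masked mean in the spirit of that thread, i.e. dividing the summed loss by the number of known labels instead of by all elements as .mean() does, could look like this hypothetical helper:

```python
import torch

def masked_mean(loss: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    # Sum only the losses of the known labels and divide by how many there
    # are, instead of by the total number of elements as .mean() would do.
    # clamp(min=1) guards against a batch where every label is missing.
    return (loss * mask).sum() / mask.sum().clamp(min=1)
```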