Is there an example for multi-class multi-label classification in PyTorch?

The link points to a legacy version of the loss.
This is the current implementation in the master branch.
The main difference is that the loss is now averaged over the feature dimension:

loss = loss.sum(dim=1) / input.size(1)  # only return N loss values
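To make the averaging concrete, here is a minimal sketch (my own illustration, not the internal implementation) computing the elementwise binary cross-entropy with logits by hand and averaging over the class dimension, which matches `nn.MultiLabelSoftMarginLoss` with `reduction='none'`:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.randn(4, 3)
y = torch.randint(0, 2, (4, 3)).float()

# Elementwise BCE with logits, then averaged over the class dimension,
# yielding one loss value per sample
manual = -(y * F.logsigmoid(x) + (1 - y) * F.logsigmoid(-x)).mean(dim=1)

criterion = nn.MultiLabelSoftMarginLoss(reduction='none')
print(torch.allclose(manual, criterion(x, y)))  # True
```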

Here is an older post comparing both losses; its code won't run anymore due to the resulting shape mismatch.

Here is the updated version:

import torch
import torch.nn as nn

x = torch.randn(10, 3)
y = torch.FloatTensor(10, 3).random_(2)

# double the loss for class 1
class_weight = torch.FloatTensor([1.0, 2.0, 1.0])
# double the loss for last sample
element_weight = torch.FloatTensor([1.0]*9 + [2.0]).view(-1, 1)
element_weight = element_weight.repeat(1, 3)

bce_criterion = nn.BCEWithLogitsLoss(weight=None, reduction='none')
multi_criterion = nn.MultiLabelSoftMarginLoss(weight=None, reduction='none')

bce_criterion_class = nn.BCEWithLogitsLoss(weight=class_weight, reduction='none')
multi_criterion_class = nn.MultiLabelSoftMarginLoss(weight=class_weight, reduction='none')

bce_criterion_element = nn.BCEWithLogitsLoss(weight=element_weight, reduction='none')
multi_criterion_element = nn.MultiLabelSoftMarginLoss(weight=element_weight, reduction='none')

bce_loss = bce_criterion(x, y)
multi_loss = multi_criterion(x, y)

bce_loss_class = bce_criterion_class(x, y)
multi_loss_class = multi_criterion_class(x, y)

bce_loss_element = bce_criterion_element(x, y)
multi_loss_element = multi_criterion_element(x, y)

print(torch.allclose(bce_loss.mean(1), multi_loss))
> True
print(torch.allclose(bce_loss_class.mean(1), multi_loss_class))
> True
print(torch.allclose(bce_loss_element.mean(1), multi_loss_element))
> True

Yes, and I think it could still be an issue: logsigmoid is numerically more stable than applying log after sigmoid, since the LogSumExp trick is applied internally, as seen here.
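A quick way to see the stability difference (a small demo of my own, using a deliberately extreme logit): in float32, `sigmoid(-200)` underflows to 0, so the naive `log(sigmoid(x))` returns `-inf`, while `F.logsigmoid` stays finite:

```python
import torch
import torch.nn.functional as F

x = torch.tensor([-200.0])

# Naive composition underflows: sigmoid(-200) rounds to 0.0, log(0) = -inf
naive = torch.log(torch.sigmoid(x))
print(naive)  # tensor([-inf])

# logsigmoid is computed in a numerically stable way and returns the exact value
stable = F.logsigmoid(x)
print(stable)  # tensor([-200.])
```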