I am trying to train ENet (https://arxiv.org/abs/1606.02147) on the CamVid dataset using the Lovász-Softmax loss. Originally, ENet uses the following class-weighting strategy:
```python
import numpy as np

def get_class_weights(loader, num_classes, c=1.02):
    # Estimate per-class pixel frequencies from one batch of labels
    _, y = next(iter(loader))
    y_flat = y.flatten().numpy()
    each_class = np.bincount(y_flat, minlength=num_classes)
    p_class = each_class / len(y_flat)
    # Rare classes (small p_class) get larger weights
    return 1 / np.log(c + p_class)
```
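To sanity-check the formula, w = 1 / log(c + p) grows as a class gets rarer. A quick standalone check (the toy labels below are made up, not from CamVid):

```python
import numpy as np

def weights_from_labels(y_flat, num_classes, c=1.02):
    # Same formula as get_class_weights, applied to a flat label array
    each_class = np.bincount(y_flat, minlength=num_classes)
    p_class = each_class / len(y_flat)
    return 1 / np.log(c + p_class)

labels = np.array([0] * 90 + [1] * 10)  # class 1 is rare
w = weights_from_labels(labels, num_classes=2)
print(w)  # the rare class receives the larger weight
```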
I am applying this weighting strategy with:
```python
import torch
from torch.nn import CrossEntropyLoss

class_weights = get_class_weights(train_loader, 12)
criterion = CrossEntropyLoss(
    weight=torch.FloatTensor(class_weights).to(device)
)
```
How do I apply class weights to a custom loss function such as Lovász-Softmax, which does not accept a `weight` parameter?
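For context, this is the kind of adaptation I have in mind, assuming the Lovász-Softmax implementation can expose one loss term per class before averaging (the function name and arrays below are hypothetical, shown with NumPy for brevity; the same arithmetic would apply to torch tensors):

```python
import numpy as np

def weighted_class_loss(per_class_losses, class_weights):
    # per_class_losses: hypothetical array with one Lovász term per class
    # class_weights: e.g. the output of get_class_weights above
    w = class_weights / class_weights.sum()  # normalize to keep the loss scale stable
    # Replace the plain mean over classes with a weighted sum
    return float((w * per_class_losses).sum())

losses = np.array([0.2, 0.8])   # hypothetical per-class loss terms
weights = np.array([1.5, 8.8])  # hypothetical class weights
print(weighted_class_loss(losses, weights))
```

Is replacing the per-class mean with a weighted mean like this the right way to do it, or is there a more standard approach?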