Weights in weighted loss (nn.CrossEntropyLoss)

Hello,

I am not sure what you mean by "reverse order", but I think it is better to normalize the weights proportionally to the inverse of the class frequencies (so the more examples a class has in the training data, the smaller its weight in the loss). Here is what I would do:

    # class frequencies, e.g. the fraction of samples per class
    weights = torch.tensor([9.8, 68.0, 5.3, 3.5, 10.8, 1.1, 1.4], dtype=torch.float32)
    weights = weights / weights.sum()  # normalize to sum to 1
    print(weights)
    weights = 1.0 / weights  # invert: frequent classes get smaller weights
    weights = weights / weights.sum()  # renormalize to sum to 1
    print(weights)
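
You would then pass the resulting tensor to the `weight` argument of `nn.CrossEntropyLoss`. A minimal sketch (the logits and targets here are toy values, just to show the shapes):

    import torch
    import torch.nn as nn

    # class frequencies from above, inverted and normalized
    freqs = torch.tensor([9.8, 68.0, 5.3, 3.5, 10.8, 1.1, 1.4], dtype=torch.float32)
    weights = 1.0 / (freqs / freqs.sum())
    weights = weights / weights.sum()

    # per-class weights go into the loss via the `weight` argument
    criterion = nn.CrossEntropyLoss(weight=weights)

    # toy batch: 4 samples, 7 classes
    logits = torch.randn(4, 7)
    targets = torch.tensor([0, 1, 5, 6])
    loss = criterion(logits, targets)
    print(loss.item())

Note that `nn.CrossEntropyLoss` expects raw logits (it applies log-softmax internally), and with `weight` set, the default `reduction='mean'` divides by the sum of the weights of the targets in the batch, not by the batch size.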