How to consider weights in cross entropy loss

Hi
There is a weight option in the nn.CrossEntropyLoss class. What I don't understand is whether I should calculate the weights over the whole dataset and then pass them to the loss function, or whether I have to calculate them separately for every batch.

Thanks

In theory you would only set it once, according to how imbalanced the classes in your training dataset are. This is also implied by the fact that it is a parameter you pass at initialization time rather than to the forward method.
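
As a minimal sketch (assuming integer class labels and using inverse-frequency weighting, which is just one common choice), you could compute the weights once from the full training labels and pass them to the constructor:

```python
import torch
import torch.nn as nn

# Toy example: labels for the whole training set
train_labels = torch.tensor([0, 0, 0, 0, 0, 0, 1, 1, 2, 2])
class_counts = torch.bincount(train_labels)  # tensor([6, 2, 2])

# Inverse-frequency weights (one common heuristic); must be a float tensor of size C
weights = class_counts.sum() / (len(class_counts) * class_counts.float())

# Set the weights once, at initialization time
criterion = nn.CrossEntropyLoss(weight=weights)

# The training loop stays unchanged; each batch uses the same criterion
logits = torch.randn(4, 3)            # (batch_size, num_classes)
targets = torch.tensor([0, 1, 2, 0])  # batch labels
loss = criterion(logits, targets)
```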
