Can't force class weights

I have a class imbalance where the classes are split like this:
0: 5.2%
1: 6.51%
2: 76.7%
3: 6.38%
4: 5.13%

I’m computing class weights with

import numpy as np
import torch
import torch.nn as nn
from sklearn.utils.class_weight import compute_class_weight

class_weights = compute_class_weight(class_weight='balanced', classes=np.unique(ys), y=ys)
# produces: class_weights = [3.8412954, 3.06959094, 0.26054039, 3.13274541, 3.8984915]
...

# nn.CrossEntropyLoss expects the weights as a FloatTensor, not a numpy array
loss_function = nn.CrossEntropyLoss(weight=torch.tensor(class_weights, dtype=torch.float32))
optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)
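
For reference, 'balanced' just sets each weight to n_samples / (n_classes * class_count), so the values can be sanity-checked against the distribution above (a minimal sketch using the rounded percentages from this post):

import numpy as np

# class frequencies from the distribution above (rounded percentages)
freqs = np.array([0.052, 0.0651, 0.767, 0.0638, 0.0513])
n_classes = len(freqs)

# 'balanced' computes n_samples / (n_classes * class_count),
# which reduces to 1 / (n_classes * frequency)
print(1.0 / (n_classes * freqs))
# -> approx. [3.846, 3.072, 0.261, 3.135, 3.899], matching the sklearn output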

However, I still see that the model is biased towards predicting class 2 and isn't really learning the other classes.

Any advice on how to address this?

Did you increase the weights of the minority classes to “force” the model to learn them, or did you only use the sklearn-computed weights?

I just used the sklearn-computed weights.

> Did you increase the weights of the minority classes to “force” the model to learn them

I thought that by using the weights computed by sklearn I was doing just that.

Hey @ptrblck
I was wondering if you had some other suggestions here.
Thanks in advance.

Increase the weights of the minority classes until your model overfits them, to make sure your code doesn't have any other issues.
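
A minimal sketch of that debugging step, assuming the five classes above; the boost factor is arbitrary and only meant for this overfitting check:

import torch
import torch.nn as nn

# sklearn 'balanced' weights from the original post
class_weights = torch.tensor([3.8412954, 3.06959094, 0.26054039, 3.13274541, 3.8984915])

# aggressively boost every class except the majority class 2 (debugging only);
# raise the factor until the minority classes start to overfit
boost = 10.0
debug_weights = class_weights.clone()
debug_weights[[0, 1, 3, 4]] *= boost

loss_function = nn.CrossEntropyLoss(weight=debug_weights)
# if the model still predicts only class 2 even with these extreme weights,
# the problem is likely elsewhere (data loading, label encoding, loss usage)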