Learning Curves and Behaviour

I trained AlexNet on an ECG dataset, which is imbalanced. My learning curves are shown below. I trained for 10 epochs with a learning rate of 0.001, the Adam optimizer, and a batch size of 16.


It also skips some classes, as shown here:

Where did I go wrong?
I would also like to know what impact the batch size and learning rate have on this. I don't understand this weird learning curve.

I also trained VGG16 and got these results:

Your model seems to overfit to the majority class, and the accuracy might be misleading due to the Accuracy Paradox, so you might want to reconsider which metric you plot.
To counter overfitting in an imbalanced-class setup, you could use a weighted loss or a WeightedRandomSampler to oversample the minority classes.
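Something like this minimal sketch shows both approaches; the data, shapes, and names (`train_data`, `train_labels`, `train_dataset`) are placeholders standing in for your ECG dataset:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset, WeightedRandomSampler

# Dummy imbalanced dataset; names and shapes are placeholders for your ECG data
train_data = torch.randn(100, 3, 224, 224)
train_labels = torch.cat([torch.zeros(90, dtype=torch.long),
                          torch.ones(10, dtype=torch.long)])
train_dataset = TensorDataset(train_data, train_labels)

class_counts = torch.bincount(train_labels)      # samples per class
class_weights = 1.0 / class_counts.float()       # rarer classes get larger weights

# Option 1: weighted loss, one weight per class
criterion = torch.nn.CrossEntropyLoss(weight=class_weights)

# Option 2: oversample minority classes, one weight per *sample*
sample_weights = class_weights[train_labels]
sampler = WeightedRandomSampler(weights=sample_weights,
                                num_samples=len(sample_weights),
                                replacement=True)
train_loader = DataLoader(train_dataset, batch_size=16, sampler=sampler)
```

With the sampler each batch will be approximately balanced, while the weighted loss penalizes mistakes on minority classes more heavily; you would usually pick one of the two rather than combining them.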


Alright, I see this article:
Handling Imbalanced Classes with Weighted Loss in PyTorch | NaadiSpeaks
but how are those class_weights calculated? It says 1 - (#samples in class), but for which class? Is it for the class with the fewest samples? My dataset looks like this:

You could define the weights via weight = 1.0 / count, where count is the number of samples for each class.
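For example, with made-up counts for five classes (replace them with your actual counts), this would give:

```python
import torch

# Made-up per-class sample counts; replace with the counts from your dataset
counts = torch.tensor([161., 2500., 800., 3000., 500.])
weights = 1.0 / counts                # larger weight for rarer classes
weights = weights / weights.sum()     # optional: normalize so the weights sum to 1
print(weights)
```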


Hey @ptrblck, I'm facing an issue while doing transfer learning on an ECG dataset using the AlexNet architecture. My classification report came out like this:

Why do these classes have 0 precision and recall? I don't understand.

The metrics would show a zero value if, e.g., your model never predicts these classes.
Based on the results, your model is only predicting the two classes with the most support.
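You could verify this by counting how often each class index is predicted on the validation set. A rough sketch, where `model` and `val_loader` are placeholders for your trained model and validation DataLoader:

```python
import torch

# model and val_loader are placeholders for your trained model and validation loader
model.eval()
all_preds = []
with torch.no_grad():
    for data, target in val_loader:
        output = model(data)
        all_preds.append(output.argmax(dim=1))
all_preds = torch.cat(all_preds)

# Shows which class indices the model ever predicts, and how often
print(torch.unique(all_preds, return_counts=True))
```

If some class indices never appear in the output, their precision and recall will be reported as zero.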


So you mean I should balance the whole dataset to equal samples per class and check? Or is there another possibility?
I previously had the 'F' class (class 0) with 161 samples. I augmented it, so it now has 322 samples, but the dataset is still imbalanced. Before augmentation, my model predicted the last class as well and only skipped the 1st, 4th, and 5th, but after augmentation (still imbalanced) it skips all classes except the 2nd and 3rd.
Moreover, if I plot the classification report, it shows different results, like here.
This is without augmentation: