Improving model's performance

I am using a pre-trained MobileNetV2 model, running experiments for 30 epochs with the learning rate multiplied by 0.1 every 10 epochs, and the following hyperparameters:

learning rate = 0.001
weight decay = 4e-5
momentum = 0.9
batch size = 64
optimizer = SGD
dropout = 0.2
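For concreteness, the setup above can be sketched as follows. This is a minimal sketch, not my exact script: the `nn.Linear` stand-in replaces the actual pre-trained MobileNetV2, and the inner training loop is elided.

```python
import torch
import torch.nn as nn

# Stand-in model; in practice this is the pre-trained MobileNetV2
# with dropout = 0.2 in its classifier head.
model = nn.Linear(10, 2)

optimizer = torch.optim.SGD(
    model.parameters(),
    lr=0.001,          # initial learning rate
    momentum=0.9,
    weight_decay=4e-5,
)
# Multiply the learning rate by 0.1 every 10 epochs.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.1)

for epoch in range(30):
    # ... training loop over batches of size 64 goes here ...
    scheduler.step()
```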

I am using class weights since my training dataset is imbalanced, but my test dataset is perfectly balanced. The formula for determining each weight is minority class size / class size. I am getting good results: my confusion matrix shows per-class accuracies in the 80s and 90s. The issue I am facing is that after a while my model starts to fluctuate; for example, in my latest experiment, after 11 epochs the accuracy keeps bouncing between 87% and 91%.
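The weighting scheme described above can be sketched like this; the class counts are hypothetical placeholders for my actual dataset, and the weights are passed to `CrossEntropyLoss` as is common in PyTorch:

```python
import torch
from collections import Counter

# Hypothetical per-class counts for an imbalanced training set.
class_counts = Counter({0: 5000, 1: 1200, 2: 300})
minority = min(class_counts.values())

# weight_c = minority class size / class size, so the minority class
# gets weight 1.0 and larger classes get proportionally smaller weights.
weights = torch.tensor(
    [minority / class_counts[c] for c in sorted(class_counts)],
    dtype=torch.float32,
)
criterion = torch.nn.CrossEntropyLoss(weight=weights)
```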

What I have tried so far:

doubling the weight of the minority class, so that the model learns it as well as the other classes

changing the learning rate: either 0.01 or 0.0001

changing the factor by which the learning rate is decayed

Oversampling with meaningful augmentation might work better than weighting the loss.
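One minimal way to do the oversampling part in PyTorch is a `WeightedRandomSampler` that draws minority-class samples more often; the toy dataset below is a hypothetical stand-in for the real one:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset, WeightedRandomSampler

# Hypothetical imbalanced toy dataset: 90 samples of class 0, 10 of class 1.
labels = torch.cat([torch.zeros(90, dtype=torch.long),
                    torch.ones(10, dtype=torch.long)])
data = torch.randn(100, 3)
dataset = TensorDataset(data, labels)

# Weight each sample by the inverse frequency of its class, so minority
# examples are drawn more often (with replacement) during training.
class_counts = torch.bincount(labels)
sample_weights = 1.0 / class_counts[labels].float()
sampler = WeightedRandomSampler(sample_weights,
                                num_samples=len(labels),
                                replacement=True)
loader = DataLoader(dataset, batch_size=64, sampler=sampler)
```

The augmentation (random crops, rotations, flips) is then applied in the dataset's transform, so the repeated minority samples are not exact duplicates.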


I tried your method and I am not seeing any difference, aside from the fact that the detection accuracy of my minority class now maxes out in the 70% range, whereas with class weights it would reach the high 80s to high 90s. I used the exact same setup as the one I posted in my question. It's stuck between 87% and 89% validation accuracy. This is the data augmentation method I am using:

```python
from torchvision import transforms

train_transform = transforms.Compose([
    transforms.RandomResizedCrop(224, scale=(0.2, 1.0)),
    transforms.RandomRotation(15),      # rotate +/- 15 degrees
    transforms.RandomHorizontalFlip(),  # flip 50% of images horizontally
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406],
                         [0.229, 0.224, 0.225]),
])
```