I decreased the learning rate by a factor of 10 and my model reaches 90%+ accuracy, is this normal?

Hello,
I am using transfer learning on vgg11: I train my model on dataset A and test on datasets B and C to perform binary classification.
Initially, the learning rate was 0.001, and I used a scheduler that drops the learning rate to 1/10th of its current value whenever the accuracy doesn't improve for 2 epochs, roughly as in the sketch below.
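A minimal sketch of that setup (the SGD optimizer and the replaced classifier head are assumptions for illustration; only the initial learning rate, the factor of 1/10, and the 2-epoch patience come from my description):

```python
import torch
import torchvision

# Pretrained vgg11 with a new binary-classification head (assumed setup).
model = torchvision.models.vgg11(weights=torchvision.models.VGG11_Weights.DEFAULT)
model.classifier[6] = torch.nn.Linear(4096, 2)

optimizer = torch.optim.SGD(model.parameters(), lr=0.001)

# Drop the learning rate to 1/10th of its current value when the
# monitored accuracy has not improved for 2 epochs.
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="max", factor=0.1, patience=2
)

# Called once per epoch with the validation accuracy:
# scheduler.step(val_accuracy)
```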

When I used this method, the model's accuracy was struggling to go over 50%, even though the learning rate was decreasing.

Then I initialized the learning rate at 0.0001, and the model's accuracy after the first epoch is 90% and keeps increasing.

Some more information about my training loop:

1. My batch size is 16.
2. I update the weights after every batch (see the sketch after this list).
3. At the end of each epoch I evaluate the model on the other datasets; the mean accuracy is 85% ± 2% and the mean loss varies from 0.3 to 1.1.
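A minimal sketch of that loop (the loader names and epoch count are placeholders, the cross-entropy loss is an assumption, and `model`, `optimizer`, and `scheduler` come from the sketch above):

```python
import torch

criterion = torch.nn.CrossEntropyLoss()
num_epochs = 10  # placeholder

def evaluate(model, loader):
    # Accuracy on one held-out dataset.
    model.eval()
    correct = total = 0
    with torch.no_grad():
        for inputs, targets in loader:
            preds = model(inputs).argmax(dim=1)
            correct += (preds == targets).sum().item()
            total += targets.numel()
    return correct / total

for epoch in range(num_epochs):
    model.train()
    for inputs, targets in train_loader_a:  # assumed DataLoader over dataset A, batch_size=16
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        optimizer.step()  # weights are updated after every batch

    # At the end of each epoch, evaluate on the other datasets.
    acc_b = evaluate(model, loader_b)  # assumed DataLoader over dataset B
    acc_c = evaluate(model, loader_c)  # assumed DataLoader over dataset C
    scheduler.step((acc_b + acc_c) / 2.0)
```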

Thank you

Yes, the described behavior could be normal if your learning rate was too high for an already pretrained model. Note that the original training routine might have already reduced the learning rate, so raising it back to 1e-3 (even in the original training script) might show the same effect of catapulting the parameters out of their trained state.
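For illustration, a sketch of fine-tuning with a lower learning rate, assuming `model` is the vgg11 from your question; the parameter groups and the specific values are assumptions, not a prescription:

```python
import torch

# A smaller learning rate keeps the pretrained weights close to their
# trained state instead of catapulting them out of it.
optimizer = torch.optim.SGD(model.parameters(), lr=1e-4)

# Optionally, use an even smaller rate for the pretrained backbone than
# for the freshly initialized head (values here are assumptions).
optimizer = torch.optim.SGD(
    [
        {"params": model.features.parameters(), "lr": 1e-5},
        {"params": model.classifier.parameters(), "lr": 1e-4},
    ],
    lr=1e-4,
)
```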
