I'm not sure this is actually overfitting, since the validation loss didn't increase by much. Try running training for longer and see whether the validation loss keeps rising.
One way to reduce overfitting in transfer learning is to freeze the initial layers and then train your network. In the case of ResNet, you can freeze the conv1, conv2, and conv3 stages and see if that helps.
I see. In that case, you can try stronger augmentation (color distortion, Solarize, ColorJitter, Gaussian blur, etc.) and introducing Dropout in the layers.
Got better results after switching from the Adam optimizer to SGD, but validation accuracy and loss saturate early.
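One common way to push past that early plateau is SGD with momentum plus a learning-rate drop when the validation loss stops improving. A sketch, with a stand-in model and hyperparameters that are typical fine-tuning defaults rather than values from this thread:

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 2)  # stand-in for the actual network

optimizer = torch.optim.SGD(model.parameters(), lr=0.01,
                            momentum=0.9, weight_decay=1e-4)

# Cut the learning rate by 10x if validation loss hasn't improved
# for 5 epochs.
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=5)

# In the training loop, after computing val_loss each epoch:
# scheduler.step(val_loss)
```

The saturation you see is often the learning rate being too high for the late phase of training; decaying it usually lets both curves keep improving.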
[Figure: learning curves (before early stopping)]