Final Model Training Problem - Overfitting

Hello. I am working on a CNN project for multiclass classification. I ran hyperparameter optimization to find the most suitable model and reached a best accuracy of 97.38%. I then retrained this model with Early Stopping, but the results were much worse. First, the accuracy dropped sharply. Second, the model appears to be overfitting, as shown in the learning curve below: the test loss stops decreasing after the marked point, while the training loss goes down to almost zero.

I have tried techniques such as Dropout, weight decay, and different batch sizes, but the issue persists: the training loss drops rapidly and converges quickly, while the test loss saturates at a fixed level and the model does not yield good predictions.

What other steps can I take to solve this issue? Can I fix the model directly during the Early Stopping run, or do I have to rerun the hyperparameter optimization from the beginning? Also, why does the same model perform exceptionally well under K-Fold cross-validation but not with Early Stopping?

[Learning curve: training loss converges to near zero while test loss plateaus]
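For reference, here is a minimal sketch of the kind of setup I'm describing (Keras is used here only for illustration; the architecture, shapes, and rates are placeholders, not my actual model):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers, regularizers

# Placeholder data standing in for my real multiclass image dataset.
num_classes = 5
x_train = np.random.rand(64, 32, 32, 3).astype("float32")
y_train = np.random.randint(0, num_classes, 64)
x_val = np.random.rand(16, 32, 32, 3).astype("float32")
y_val = np.random.randint(0, num_classes, 16)

model = keras.Sequential([
    keras.Input(shape=(32, 32, 3)),
    layers.Conv2D(16, 3, activation="relu",
                  kernel_regularizer=regularizers.l2(1e-4)),  # weight decay
    layers.MaxPooling2D(),
    layers.Dropout(0.3),  # dropout added to fight overfitting
    layers.Flatten(),
    layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Early Stopping on validation loss, restoring the best weights.
early_stop = keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=5, restore_best_weights=True)

history = model.fit(x_train, y_train,
                    validation_data=(x_val, y_val),
                    epochs=3, batch_size=16,
                    callbacks=[early_stop], verbose=0)
```

Even with the Dropout layer and L2 regularization shown above, the training loss in my real runs still collapses to near zero while the validation loss flatlines.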