How to prevent the model from failing to detect a class in some epochs

I have trained a VGG16 model. Training works fine, but while testing the model I found that one class is not recognized at all in some epochs, while in other epochs all the classes are detected. (I noticed this by printing the predicted labels with np.unique(y_predict).)
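For context, the check looks roughly like this (model and x_test are placeholders for my trained Keras VGG16 and the test images; the exact names differ in my code):

    import numpy as np

    # model and x_test stand in for the trained VGG16 and the test images
    probs = model.predict(x_test)          # class probabilities, shape (n_samples, n_classes)
    y_predict = np.argmax(probs, axis=1)   # predicted class index per sample
    print(np.unique(y_predict))            # in the problem epochs, one class index is missing here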

I printed the classification report and accuracy using classification_report and confusion_matrix from sklearn.metrics, which produced the following warning, but only in those epochs where one of the classes was not detected; in the other epochs there was no warning because all the classes were detected. (I understand the warning is caused by a division by zero when computing precision for a class that has no predicted samples.)
UndefinedMetricWarning: Precision and F-score are ill-defined and being set to 0.0 in labels with no predicted samples. Use zero_division parameter to control this behavior.
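To make the warning concrete, here is a small self-contained toy example (with made-up labels, not my actual data) that reproduces it; the zero_division argument mentioned in the message only silences the warning, it does not make the missing class appear in the predictions:

    import numpy as np
    from sklearn.metrics import classification_report, confusion_matrix

    # Toy labels reproducing the situation: class 2 never appears in the predictions
    y_true = np.array([0, 1, 2, 0, 1, 2])
    y_pred = np.array([0, 1, 1, 0, 1, 0])

    print(np.unique(y_pred))                 # -> [0 1]; class 2 is missing
    print(confusion_matrix(y_true, y_pred))
    # zero_division=0 sets precision/F-score to 0.0 for classes with no predicted
    # samples instead of warning; the underlying problem is unchanged
    print(classification_report(y_true, y_pred, zero_division=0))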

Using another model I am able to detect all the classes in the dataset in every epoch, but VGG16 misses one of the classes in some epochs while detecting all of them in others.

My question is how to solve this problem, i.e. how to make the model detect all the classes in every epoch. Is there a way to do this, for example by using softmax at the end of the last layer, or some other solution?