Does an exploding loss mean overfitting?

Not necessarily — an exploding loss usually points to training instability rather than overfitting. I imagine you are using cross-entropy loss somewhere. If your classes are imbalanced, you could try balancing their importance by assigning per-class weights in the loss (e.g. the `weight` argument of PyTorch's `nn.CrossEntropyLoss`). Maybe this thread could help a bit.
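To make the weighting concrete, here is a minimal pure-Python sketch of class-weighted cross-entropy (the same averaging PyTorch uses: weighted sum of per-sample losses divided by the sum of the weights that were applied). The probabilities, labels, and weight values below are made up for illustration:

```python
import math

def weighted_cross_entropy(probs, targets, weights):
    """Class-weighted cross-entropy, averaged as
    sum_i w[y_i] * -log(p_i[y_i]) / sum_i w[y_i]."""
    num = 0.0
    den = 0.0
    for p, y in zip(probs, targets):
        num += weights[y] * -math.log(p[y])  # up-weight losses on rare classes
        den += weights[y]
    return num / den

# Hypothetical example: class 1 is rare, so it gets a larger weight.
probs = [[0.9, 0.1], [0.2, 0.8]]  # predicted class probabilities per sample
targets = [0, 1]                  # true labels
weights = [1.0, 5.0]              # per-class weights (illustrative values)
loss = weighted_cross_entropy(probs, targets, weights)
```

With weights of all ones this reduces to the plain mean cross-entropy; increasing a class's weight makes mistakes on that class cost proportionally more.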