Training loss plateaus in self-supervised training

I am training a self-supervised learning model. The training loss oscillates within a range of values and stops decreasing beyond a certain point, even after 50+ epochs of training.

As a sanity check, I want to make the model overfit. I have set dropout to 0 and will try tuning the learning rate. What else can be done? (See the single-batch check I am planning to run, sketched below.)
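
For reference, this is the single-batch overfitting check I intend to run: train repeatedly on one small fixed batch and see whether the loss can be driven close to zero. If it can't, the issue is likely in the model, loss, or optimizer setup rather than in regularization or dataset size. This is only a minimal sketch; the model architecture, dimensions, and the MSE loss standing in for my actual SSL objective are all placeholders:

```python
import torch
from torch import nn

# Placeholder model; substitute the actual SSL model.
model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 128))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.MSELoss()  # stand-in for the real SSL objective

# One small fixed batch, reused every step.
x = torch.randn(32, 128)        # fixed inputs
target = torch.randn(32, 128)   # fixed targets (e.g., other views / reconstructions)

for step in range(1000):
    optimizer.zero_grad()
    loss = criterion(model(x), target)
    loss.backward()
    optimizer.step()
    if step % 100 == 0:
        print(f"step {step}: loss {loss.item():.6f}")

# If the loss does not approach ~0 on this single batch, the problem is in
# the model, loss, or optimization, not in regularization or data volume.
```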

Training loss currently:

[training loss plot]