Loss plateau in self-supervised learning with SimCLR

Hi friends,

I am using a PyTorch implementation of SimCLR and training it on my own dataset.

The problem is that after 100 epochs, the loss dropped from 5.6 to 5.0 and then stopped decreasing. I wonder what might be causing this.

The learning rate I set is 0.2, I wrap the optimizer with LARS with eeta=0.001, and the batch size is 512 (around 200,000 small images with a ResNet-18).
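For reference, here is a quick sanity check on the loss scale, assuming the implementation uses the standard NT-Xent contrastive loss (I'm assuming that's what this repo computes): with batch size N, each augmented view is compared against 2N − 1 candidates, so a randomly initialized model should start around ln(2N − 1).

```python
import math

# NT-Xent with batch size N compares each of the 2N augmented views
# against 2N - 1 other views (1 positive + 2N - 2 negatives), so the
# chance-level (random-embedding) loss is roughly ln(2N - 1).
N = 512
chance_loss = math.log(2 * N - 1)
print(round(chance_loss, 2))  # ~6.93
```

So at batch size 512, chance level is about 6.93, which means 5.0 is below random but still far from the near-zero values I'd expect after convergence.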

As you can see, the learning rate is rather moderate (not inappropriately large), right? So what do you think might be the problem~


Or is it a normal situation~