Fine-tuning RoBERTa

Hello, I’ve run into the same problem. I’m fine-tuning a RoBERTa model with the Trainer from the Transformers library. The first few runs everything was fine and the model trained as expected, but after restarting the kernel it started producing nonsense: from the first epoch, F1 tends to 0 and the loss keeps growing.
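One common cause of behavior that changes after a kernel restart is unseeded randomness: the classification head is re-initialized and the data is re-shuffled with a different seed each run, so two runs are not comparable. This is only a guess at the cause, but a minimal sketch of pinning all seeds before training (with Transformers, a single call to `set_seed(seed)` does this, covering `random`, NumPy, and PyTorch):

```python
import random

import numpy as np


def set_all_seeds(seed: int) -> None:
    # Fix every RNG the training run touches. transformers.set_seed(seed)
    # does the same in one call and also seeds torch / CUDA if available.
    random.seed(seed)
    np.random.seed(seed)


# Simulate two "kernel restarts" with the same seed:
set_all_seeds(42)
first_run = np.random.rand(3)   # e.g. head-init / shuffle randomness

set_all_seeds(42)
second_run = np.random.rand(3)

# With pinned seeds, both runs draw identical values.
print(np.allclose(first_run, second_run))
```

If the runs still diverge with a fixed seed, the usual next suspects are the learning rate (too high for the restarted run) or accidentally resuming from a stale checkpoint.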