After 2nd iteration: always same result when training in a loop #8569

I train a BERT model on a binary classification task, running the training 4 times in a row with the same train and validation data and the exact same hyperparameters. I do not set any seeds.

The results of the 2nd, 3rd and 4th iterations are 100% identical; only the result of the 1st run is unique. This behavior is 100% reproducible.

Due to the random initialization I would expect all runs to produce different results, just as the 1st run differs from the others. The fact that the 2nd and all following runs are identical must be a bug.
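To illustrate what I mean, here is a toy sketch in plain Python (not my actual training code, and `train_stub` is a made-up stand-in for a training run): if nothing re-seeds the RNG between runs, each run draws different "initial weights"; but if something silently re-applies a fixed seed before each run, the runs repeat exactly, which looks like what I am seeing from the 2nd run onward.

```python
import random

def train_stub(seed=None):
    # Stand-in for one training run: the "result" is just a few RNG draws,
    # mimicking random weight initialization.
    if seed is not None:
        random.seed(seed)
    return [random.random() for _ in range(3)]

# Without any seeding, consecutive runs should differ:
a, b = train_stub(), train_stub()
print(a == b)   # expected: False

# If a fixed seed is (re-)applied before each run, results repeat exactly:
c, d = train_stub(seed=42), train_stub(seed=42)
print(c == d)   # expected: True
```

So my suspicion is that something in the library re-seeds the RNG after (or during) the 1st run, making every later run start from the same state.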

Has anybody else seen a problem like this? Am I doing anything wrong, or is there indeed a major bug going on?

Full bug description here: