Strange loss increase when restarting training

Could be your learning rate. If it decays based on the number of iterations, make sure the iteration count (and any scheduler state) is restored correctly when training restarts; otherwise the schedule resets to the initial, higher learning rate, which can cause exactly this kind of loss spike.
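
For example, here is a minimal sketch assuming a PyTorch setup with a step-based scheduler (the model, file name, and iteration count are placeholders):

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 1)  # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1000, gamma=0.5)

# --- saving a checkpoint ---
torch.save({
    "iteration": 5000,                          # example iteration count
    "model_state": model.state_dict(),
    "optimizer_state": optimizer.state_dict(),
    "scheduler_state": scheduler.state_dict(),  # holds the LR decay progress
}, "checkpoint.pt")

# --- restoring on restart ---
ckpt = torch.load("checkpoint.pt")
model.load_state_dict(ckpt["model_state"])
optimizer.load_state_dict(ckpt["optimizer_state"])
scheduler.load_state_dict(ckpt["scheduler_state"])
start_iteration = ckpt["iteration"]  # resume counting from here, not from 0,
                                     # or the LR schedule starts over
```

If you only reload the model weights and start the loop from iteration 0, the optimizer and scheduler behave as if training just began, and the loss can jump even though the weights are fine.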