Loss suddenly jumps and stays constant

Hi,
I am training a CNN that outputs an image.
My loss decreases nicely at first, but then it suddenly jumps up and stays roughly constant.
Has anyone seen this phenomenon before?
It happens during the first or second epoch.
I tried Adam and SGD, and also changed the activations, with no luck.
The only thing that helps is reducing the number of conv filters.
Does anyone have an idea?
Thanks,
Nitsan

I’ve seen similar behavior in the past, where a “bad” update step produced large gradients and pushed the model parameters out of the “nice” region of the loss landscape.
I think lowering the learning rate helped, but I would also assume that e.g. Adam should reduce this issue.
Anyway, you could check the gradients for each update step and compare their magnitudes, in case you want to debug your model.
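
Not part of the original suggestion, but here is a minimal sketch of how you could log the global gradient norm per step (assuming a standard training loop where `model`, `criterion`, `optimizer`, and `loader` are already defined):

```python
import torch

# Hypothetical training loop: `model`, `criterion`, `optimizer`, and
# `loader` are assumed to exist; the gradient logging is the point here.
for step, (inputs, targets) in enumerate(loader):
    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()

    # Global L2 norm over all parameter gradients; a sudden spike here
    # would point to the "bad update step" described above.
    grad_norms = [p.grad.detach().norm(2)
                  for p in model.parameters() if p.grad is not None]
    total_norm = torch.norm(torch.stack(grad_norms))
    print(f"step {step}: loss={loss.item():.4f}, grad_norm={total_norm:.4f}")

    optimizer.step()
```

If spikes do show up there, `torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm)` called before `optimizer.step()` is a common way to cap them.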

Hi,
Thanks for the reply.
I just want to update that using AMSGrad seems to have resolved the issue.
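In case it helps anyone who lands here later: AMSGrad is just a flag on the Adam constructor (the learning rate below is the default, not something I tuned):

```python
import torch

# AMSGrad keeps a running maximum of the second-moment estimate, which
# stops the effective per-parameter learning rate from growing back up.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, amsgrad=True)
```
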
Thanks again.