Loss suddenly increases using Adam optimizer

Yes, I use a combination of gradient clipping and batch normalization, which has pretty much ensured that the loss never blows up like that again.
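
For reference, a minimal sketch of what that combination can look like. The original post does not name a framework, so PyTorch is assumed here, and the layer sizes, learning rate, and the `max_norm=1.0` clipping threshold are illustrative placeholders rather than my exact settings: batch normalization goes in as a layer of the model, and the gradients are clipped by global norm right before each Adam step.

```python
import torch
import torch.nn as nn

# Illustrative model: batch normalization after the first linear layer.
# Sizes are placeholders, not the original setup.
model = nn.Sequential(
    nn.Linear(64, 128),
    nn.BatchNorm1d(128),   # batch normalization
    nn.ReLU(),
    nn.Linear(128, 10),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def train_step(inputs, targets):
    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    # Gradient clipping: rescale gradients so their global norm stays <= 1.0
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
    optimizer.step()
    return loss.item()
```

The clipping call sits between `backward()` and `step()`, so a single bad batch can no longer produce a huge update; batch normalization keeps the activations (and hence the gradients) on a more stable scale in the first place.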