Changing Optimizer

Hi,
I have trained a model with the Adam optimizer for 30 epochs. Now I want to load the saved model and continue training with the SGD optimizer. Will it decrease my loss value? Is that okay?

It’s fine. In fact, switching from Adam to SGD partway through training is a fairly typical strategy. Note that the optimal learning rate differs between the two optimizers, and you have no prior knowledge of which value will work well for SGD, so you will have to run a few trials until you find a suitable one.
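
A minimal sketch of how this might look in PyTorch, assuming the model was saved with `torch.save(model.state_dict(), ...)`; the model architecture, checkpoint path, learning rate, and dummy data below are placeholders, not your actual setup:

```python
import torch
import torch.nn as nn
from torch.optim import SGD

# Toy stand-in for your real model; replace with the architecture you trained.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))

# Load the weights saved after the 30 Adam epochs (placeholder path).
model.load_state_dict(torch.load("model_epoch30.pth"))

# Create a fresh SGD optimizer for the continued run; lr=0.01 is only a
# starting guess and will likely need tuning, as noted above.
optimizer = SGD(model.parameters(), lr=0.01, momentum=0.9)
criterion = nn.CrossEntropyLoss()  # use whatever loss you trained with

# Dummy batch just to illustrate one training step.
inputs = torch.randn(8, 10)
targets = torch.randint(0, 2, (8,))

model.train()
optimizer.zero_grad()
loss = criterion(model(inputs), targets)
loss.backward()
optimizer.step()
print(f"loss after one SGD step: {loss.item():.4f}")
```

Note that the SGD optimizer here starts from scratch (no optimizer state is restored), which is exactly what you want when switching optimizers; only the model weights carry over from the Adam run.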