Resuming training with a different optimizer learning rate in libtorch

Hello everyone.

I am using libtorch-cxx11-abi-shared-with-deps-2.0.1+cpu

Suppose I trained a model with SGD and saved both the model and the optimizer. Now I want to resume training with the same optimizer, but this time with a different learning rate (say 0.002). Can I do that?
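This is roughly what I have in mind (a minimal sketch; the file names are placeholders, and I am assuming the usual pattern of casting each param group's options to `SGDOptions` to change the learning rate):

```cpp
#include <torch/torch.h>

int main() {
  // Same architecture as the original training run.
  auto model = torch::nn::Linear(10, 1);

  // Construct the optimizer first, then restore its saved state.
  torch::optim::SGD optimizer(model->parameters(),
                              torch::optim::SGDOptions(0.01).momentum(0.9));

  torch::load(model, "model.pt");
  torch::load(optimizer, "optimizer.pt");

  // Override the learning rate in every param group before resuming.
  for (auto& group : optimizer.param_groups()) {
    static_cast<torch::optim::SGDOptions&>(group.options()).lr(0.002);
  }

  // ... continue the training loop with the new learning rate ...
  return 0;
}
```

Is this the right way to do it, or is there a more direct API for this?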

Another thing to clarify: when resuming training, we usually load both the model and the optimizer from saved checkpoints. But suppose this time I only load the model, not the optimizer, and continue training. What would the problem be?
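To make the second scenario concrete, this is the variant I mean, where only the weights are restored and the optimizer is created from scratch (file names are again placeholders), so as I understand it any state such as SGD's momentum buffers would start from zero:

```cpp
#include <torch/torch.h>

int main() {
  // Restore only the model weights from the checkpoint.
  auto model = torch::nn::Linear(10, 1);
  torch::load(model, "model.pt");

  // Fresh optimizer: none of the state from the previous run is carried over.
  torch::optim::SGD optimizer(model->parameters(),
                              torch::optim::SGDOptions(0.002).momentum(0.9));

  // ... continue the training loop ...
  return 0;
}
```

Is losing that optimizer state the only concern, or are there other issues I should expect?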