Updating learning rate during training

Hi,
I was wondering how I can update the learning rate of the torch::optim::Adam optimizer after a few iterations of training in libtorch. In the Python version there is lr_scheduler, but it seems this is not implemented in libtorch yet.

Is there a way to set the learning rate manually after each epoch?
Here is how I initialize my optimizer:

torch::optim::Adam optimizer(model->parameters(), torch::optim::AdamOptions(learning_rate).weight_decay(weight_decay));
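
For context, what I have in mind is something along these lines after each epoch. This is only a rough sketch based on the param_groups() accessor, so I'm not sure it is the intended way to do it:

#include <torch/torch.h>

// Sketch: overwrite the learning rate on every parameter group of the optimizer.
// Assumes the groups hold AdamOptions, since the optimizer is torch::optim::Adam.
void set_learning_rate(torch::optim::Adam& optimizer, double new_lr) {
  for (auto& group : optimizer.param_groups()) {
    if (group.has_options()) {
      static_cast<torch::optim::AdamOptions&>(group.options()).lr(new_lr);
    }
  }
}

Would calling something like this once per epoch be the right approach, or is there a cleaner way?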