Adam optimizer with jit::script::Module

Dear all,
I have successfully imported a pre-trained model into libtorch using the torch::jit::load function:

torch::jit::script::Module model = torch::jit::load("mymodel.pt");
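
For context, a minimal version of my loading code looks roughly like this (the file name is just a placeholder, and I switch the module to training mode because I want to fine-tune it):

#include <torch/script.h>
#include <iostream>

torch::jit::script::Module model;
try {
    // Deserialize the TorchScript module from a file (placeholder path).
    model = torch::jit::load("mymodel.pt");
} catch (const c10::Error& e) {
    std::cerr << "Error loading the model: " << e.what() << std::endl;
    return -1;
}
// Enable training mode, since the plan is to fine-tune at runtime.
model.train();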

Now I want to perform fine-tuning at runtime using the Adam optimizer. I have tried to instantiate the optimizer with:

torch::optim::Adam optimizer(model.parameters(), torch::optim::AdamOptions(/*lr=*/0.001));

However, I get a compilation error, since parameters() returns an object of type torch::jit::parameter_list rather than the std::vector<Tensor> that the optimizer constructor expects. How can I perform this conversion, or otherwise solve the problem?
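
My current workaround attempt is to copy the parameters into a plain std::vector by hand, roughly as below. This is untested, and I am not sure it is the intended approach (depending on the libtorch version, iterating over parameters() may not yield plain tensors):

// Collect the script module's parameters into a std::vector<at::Tensor>
// so they can be passed to the optimizer constructor. This assumes that
// iterating over model.parameters() yields at::Tensor values directly.
std::vector<at::Tensor> params;
for (const auto& p : model.parameters()) {
    params.push_back(p);
}

torch::optim::Adam optimizer(params, torch::optim::AdamOptions(/*lr=*/0.001));

Is this the right way to do it, or is there a more direct API for wiring a jit::script::Module into torch::optim optimizers?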

Thanks in advance