Adam optimizer with jit::script::Module

Dear all,
I have successfully imported a pre-trained model in libtorch using the torch::jit::load function:

torch::jit::script::Module model = torch::jit::load("...");

Now I want to perform fine-tuning at runtime using the Adam optimizer. I have tried to instantiate the optimizer with:

torch::optim::Adam optimizer(model.parameters(), torch::optim::AdamOptions(/*lr=*/0.001));

However, I get an error, since parameters() returns an object of type torch::jit::parameter_list rather than the std::vector&lt;at::Tensor&gt; the optimizer constructor expects. How can I perform this conversion, or otherwise solve the problem?

Thanks in advance

I have the same problem. Did you find a solution by any chance?

Something like this seems to work:

    auto model = torch::jit::load(model_path);
    // script::Module::parameters() yields at::Tensor slots (on recent libtorch
    // versions), so they can be copied straight into a vector for the optimizer.
    std::vector<at::Tensor> paramsVector;
    for (const auto& param : model.parameters()) {
        paramsVector.push_back(param);
    }
    torch::optim::Adam optimizer(paramsVector, torch::optim::AdamOptions(2e-4));
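For completeness, here is a minimal sketch of a fine-tuning loop built on that vector of parameters. The model path, input/target shapes, and the cross-entropy loss are placeholder assumptions for illustration; substitute whatever matches your actual model's signature:

```cpp
#include <torch/script.h>
#include <torch/torch.h>
#include <vector>

int main() {
  // Hypothetical path; replace with your serialized TorchScript model.
  auto model = torch::jit::load("model.pt");
  model.train();  // enable training mode (dropout, batch norm, etc.)

  // Collect the module's parameters for the optimizer.
  std::vector<at::Tensor> params;
  for (const auto& p : model.parameters()) {
    params.push_back(p);
  }
  torch::optim::Adam optimizer(params, torch::optim::AdamOptions(1e-3));

  // Dummy batch: shapes assume an image classifier with 10 classes.
  auto input  = torch::randn({8, 3, 224, 224});
  auto target = torch::randint(0, 10, {8});

  for (int step = 0; step < 10; ++step) {
    optimizer.zero_grad();
    // forward() takes a vector of IValues and returns an IValue.
    auto output = model.forward({input}).toTensor();
    auto loss = torch::nn::functional::cross_entropy(output, target);
    loss.backward();
    optimizer.step();
  }
  return 0;
}
```

Note that the gradients flow into the jit module's own parameter tensors (the vector holds views of the same storage), so stepping the optimizer updates the loaded model in place.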