How to use parameters from multiple networks with one optimizer

In PyTorch it is possible to do:

params = list(fc1.parameters()) + list(fc2.parameters())
optimizer = torch.optim.SGD(params, lr=0.01)

How can I do the same with the C++ API?
I'm not sure I'm doing it right like this:

auto encoder = std::make_shared<net::Encoder>(3);
auto decoder = std::make_shared<net::Decoder>(3);

// Gather the parameters of both modules into one vector
// (recurse = true also collects parameters of registered submodules)
auto params = encoder->parameters(/*recurse=*/true);
auto decParams = decoder->parameters(/*recurse=*/true);
params.insert(params.end(), decParams.begin(), decParams.end());

auto adamOpts = torch::optim::AdamOptions(0.1);
torch::optim::Adam optimizer(params, adamOpts);

Looks like this method works!
If anyone stumbles upon this: just concatenate the parameter vectors of the networks and pass the combined vector to the optimizer.
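
For anyone who wants a complete, compilable reference, here is a minimal sketch of the same pattern. It uses torch::nn::Linear as a stand-in for the custom net::Encoder / net::Decoder modules from the question, so the layer sizes and the learning rate are illustrative assumptions, not the original code:

#include <torch/torch.h>

int main() {
  // Stand-ins for the custom net::Encoder / net::Decoder modules
  auto encoder = torch::nn::Linear(3, 8);
  auto decoder = torch::nn::Linear(8, 3);

  // Concatenate the parameter vectors of both modules
  auto params = encoder->parameters();
  auto decParams = decoder->parameters();
  params.insert(params.end(), decParams.begin(), decParams.end());

  torch::optim::Adam optimizer(params, torch::optim::AdamOptions(0.1));

  // One training step to verify that both modules receive gradients
  auto input = torch::randn({4, 3});
  auto output = decoder->forward(encoder->forward(input));
  auto loss = torch::mse_loss(output, input);

  optimizer.zero_grad();
  loss.backward();
  optimizer.step();
}

If you later need different hyperparameters per network (e.g. separate learning rates), recent libtorch versions also accept parameter groups, analogous to param_groups in Python. A rough sketch, assuming your version exposes torch::optim::OptimizerParamGroup:

std::vector<torch::optim::OptimizerParamGroup> groups;
groups.emplace_back(encoder->parameters());
groups.emplace_back(decoder->parameters());
torch::optim::Adam optimizer(groups, torch::optim::AdamOptions(0.1));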