Setting one optimizer for a list of models

I have a few models defined in a list:

G = [SomeResNet().cuda() for _ in range(no_models)]

I need to define a single Adam optimizer that covers all of these models, i.e. G[0], G[1], ....
Is the following correct?

import itertools

import torch

# Chain the parameter iterators of all models into one iterable
x = []
for i in range(no_models):
    x = itertools.chain(x, G[i].parameters())

optimizer = torch.optim.Adam(x, lr=0.01, betas=(0.8, 0.9))

If you want to gather the parameters of all modules, you could also wrap the models in an nn.ModuleList and pass the .parameters() of this ModuleList to the optimizer, as sketched below.
This approach also ensures that all submodules are pushed to the CPU/GPU together if needed.
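
For example, a minimal sketch reusing SomeResNet and no_models from the question (not tested against your exact setup):

import torch
from torch import nn

models = nn.ModuleList([SomeResNet() for _ in range(no_models)])
models.to('cuda')  # a single call moves every registered submodule

optimizer = torch.optim.Adam(models.parameters(), lr=0.01, betas=(0.8, 0.9))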

Great idea! Thank you @ptrblck; this worked for me:

from torch import nn

optimizer = torch.optim.Adam(nn.ModuleList(G).parameters(), lr=0.01, betas=(0.8, 0.9))
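
For reference, the same pattern extends to per-model hyperparameters, since the models stay separate objects in G: you can pass one parameter group per model to the same optimizer. A rough sketch (the learning rates here are made up for illustration):

import torch

# One optimizer, but a separate learning rate for each model
optimizer = torch.optim.Adam(
    [{'params': m.parameters(), 'lr': 0.01 / (i + 1)} for i, m in enumerate(G)],
    betas=(0.8, 0.9),
)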