Giving multiple parameters to the optimizer

How can I give the parameters of multiple modules to the optimizer?

fc1 = nn.Linear(784, 500)
fc2 = nn.Linear(500, 10)
optimizer = torch.optim.SGD([fc1.parameters(), fc2.parameters()], lr=0.01)  # This causes an error.

In this case, for simplicity, I don't want to wrap the layers in an nn.Module subclass.


You have to concatenate Python lists:

params = list(fc1.parameters()) + list(fc2.parameters())

torch.optim.SGD(params, lr=0.01)
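For reference, here is a minimal end-to-end sketch of that fix; the dummy batch and the single training step are illustrative only:

import torch
import torch.nn as nn

fc1 = nn.Linear(784, 500)
fc2 = nn.Linear(500, 10)

# parameters() returns a generator, so materialize each one with list()
# before concatenating
params = list(fc1.parameters()) + list(fc2.parameters())
optimizer = torch.optim.SGD(params, lr=0.01)

# one dummy training step to show that both layers get updated
x = torch.randn(32, 784)
target = torch.randint(0, 10, (32,))
loss = nn.functional.cross_entropy(fc2(torch.relu(fc1(x))), target)
optimizer.zero_grad()
loss.backward()
optimizer.step()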

Thanks, it works well.

Dear smth, you really know a lot. Thanks for your help all along.


Dear Soumith,

While executing your approach, I get:

TypeError: add() received an invalid combination of arguments - got (list), but expected one of:

  • (Tensor other, Number alpha)
  • (Number other, Number alpha)

Can you help me?

Is there something wrong?

You probably put a bracket in the wrong place. You have to convert each parameter set to a list separately and then concatenate the lists.

[SOLVED]

params = self.net.state_dict()
pas = (list(params['net.0.weight']) + list(params['net.0.bias'])
       + list(params['net.3.weight']) + list(params['net.3.bias'])
       + list(params['net.6.weight']) + list(params['net.6.bias']))
self.optimizer1 = optim.Adam(pas, lr=0.01)

Here is my code. I think everything is OK now.
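A caveat worth noting: state_dict() returns detached tensors by default, and calling list() on a 2-D weight tensor splits it into row views, so an optimizer built this way may reject the tensors or fail to update the live network. The usual pattern is to pass the module's parameters directly; a minimal sketch, where the Sequential structure below is only guessed from the 'net.0'/'net.3'/'net.6' key names:

import torch.nn as nn
import torch.optim as optim

# hypothetical architecture matching the state_dict keys above
net = nn.Sequential(
    nn.Linear(784, 500), nn.ReLU(), nn.Dropout(),
    nn.Linear(500, 100), nn.ReLU(), nn.Dropout(),
    nn.Linear(100, 10),
)

# pass the live Parameter objects rather than state_dict entries
optimizer1 = optim.Adam(net.parameters(), lr=0.01)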

Since parameters() actually returns a generator, itertools.chain() looks like a better approach:

import itertools

params = [fc1.parameters(), fc2.parameters()]

torch.optim.SGD(itertools.chain(*params), lr=0.01)
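To sketch why this works: chain() flattens the generators lazily, and the optimizer materializes everything into its first parameter group (inspecting param_groups below is just for illustration):

import itertools
import torch
import torch.nn as nn

fc1 = nn.Linear(784, 500)
fc2 = nn.Linear(500, 10)

# chain() concatenates the generators without building intermediate lists
optimizer = torch.optim.SGD(itertools.chain(fc1.parameters(), fc2.parameters()),
                            lr=0.01)

# the optimizer holds all four tensors (two weights, two biases)
print(len(optimizer.param_groups[0]['params']))  # prints 4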

How is this different from just putting all of the tensors in a list directly as OP did?

If your models are in a list or tuple somewhere already, you can also use a nested list comprehension:

models = [nn.Linear(784, 500),
          nn.Linear(500, 10)
          ]
optimizer = torch.optim.SGD((par for model in models for par in model.parameters()),
                             lr=0.01)
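If the layers live together anyway, another option (an alternative sketch, not from this thread) is nn.ModuleList, whose parameters() already walks every contained module:

import torch
import torch.nn as nn

models = nn.ModuleList([nn.Linear(784, 500), nn.Linear(500, 10)])

# ModuleList.parameters() yields the parameters of every contained module
optimizer = torch.optim.SGD(models.parameters(), lr=0.01)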

OP is passing a list of generators (each parameters() call returns one), not a flat iterable of tensors.
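For completeness: the constructor does accept a list, but it must be a flat iterable of tensors or a list of per-group dicts, not a list of generators. A sketch of the parameter-group form, which also allows per-layer options (the second learning rate is purely illustrative):

import torch
import torch.nn as nn

fc1 = nn.Linear(784, 500)
fc2 = nn.Linear(500, 10)

# a list of dicts defines parameter groups; each group can override defaults
optimizer = torch.optim.SGD(
    [{'params': fc1.parameters()},
     {'params': fc2.parameters(), 'lr': 0.001}],  # hypothetical per-layer lr
    lr=0.01,
)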

Thank you! This helped me a lot!