Giving multiple parameters to the optimizer


(Yunjey) #1

How can I give multiple parameters to the optimizer?

fc1 = nn.Linear(784, 500)
fc2 = nn.Linear(500, 10)
optimizer = torch.optim.SGD([fc1.parameters(), fc2.parameters()], lr=0.01)  # This causes an error.

In this case, for simplicity, I don't want to wrap the layers in an nn.Module class.


(smth) #2

You have to concatenate the Python lists:

params = list(fc1.parameters()) + list(fc2.parameters())
optimizer = torch.optim.SGD(params, lr=0.01)
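Put together, a minimal end-to-end sketch (the layer sizes match your snippet; the random batch and MSE loss are placeholders, just for illustration):

import torch
import torch.nn as nn
import torch.nn.functional as F

fc1 = nn.Linear(784, 500)
fc2 = nn.Linear(500, 10)

# concatenating the two parameter generators as plain Python lists lets a
# single optimizer update both layers
params = list(fc1.parameters()) + list(fc2.parameters())
optimizer = torch.optim.SGD(params, lr=0.01)

# one dummy training step on random data
x = torch.randn(32, 784)
target = torch.randn(32, 10)

loss = F.mse_loss(fc2(F.relu(fc1(x))), target)
optimizer.zero_grad()
loss.backward()
optimizer.step()

The optimizer also accepts parameter groups, so torch.optim.SGD([{'params': fc1.parameters()}, {'params': fc2.parameters()}], lr=0.01) is an equivalent option that additionally lets you give each group its own learning rate.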

(Yunjey) #3

Thanks, it works well.


(Cipher) #4

Dear smth, you really know a lot. Thanks for your help all along.


(Mr Positron) #5

Dear Soumith,

While executing your approach, I get:

TypeError: add() received an invalid combination of arguments - got (list), but expected one of:

  • (Tensor other, Number alpha)
  • (Number other, Number alpha)

Can you help me?

Is there something wrong?


(Justus Schock) #6

You probably put a bracket in the wrong place. You have to convert each set of parameters to a list separately and then add the lists.
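For instance, with two stand-in tensors w and b, a single misplaced closing bracket turns list-plus-list into tensor-plus-list:

import torch

w = torch.randn(2, 2)
b = torch.randn(2)

ok = list(w) + list(b)  # two Python lists, concatenated -- fine

# bad = list(w + list(b))  # bracket slipped: a Tensor gets "+"-ed with a
#                          # Python list, raising a TypeError like the one above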


(Mr Positron) #7

[SOLVED]

# named_parameters() yields the live nn.Parameter objects an optimizer needs
# (state_dict() returns detached tensors, and list(tensor) iterates its rows)
params = dict(self.net.named_parameters())
pas = [params['net.0.weight'], params['net.0.bias'], params['net.3.weight'],
       params['net.3.bias'], params['net.6.weight'], params['net.6.bias']]
self.optimizer1 = optim.Adam(pas, lr=0.01)

Here is my code. I think everything is OK now.
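As an aside, when you want every parameter rather than specific layers, the name lookup isn't needed; a shorter sketch (net1 and net2 are stand-in modules, just for illustration):

import itertools

import torch.nn as nn
import torch.optim as optim

net1 = nn.Linear(784, 500)  # stand-ins for the real sub-networks
net2 = nn.Linear(500, 10)

# an optimizer accepts any iterable of parameters, so the generators can be
# chained directly instead of concatenating intermediate lists
optimizer1 = optim.Adam(itertools.chain(net1.parameters(), net2.parameters()), lr=0.01)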