Is it possible to define an optimizer's parameters, like the learning rate, in a dict before creating the optimizer?

I’ve tried:

        optim_parameters = {
            'lr': params.learning_rate,
        }

        optimizer = optim.Adam([input], optim_parameters)

Instead of:

        optimizer = optim.Adam([input], lr = learning_rate) 

But I get this error:

  File "/usr/local/lib/python2.7/dist-packages/torch/optim/", line 104, in step
    step_size = group['lr'] * math.sqrt(bias_correction2) / bias_correction1
TypeError: unsupported operand type(s) for *: 'dict' and 'float'

You should unpack your dict like this:

optimizer = optim.Adam([input], **optim_parameters)
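The `**` expands the dict into keyword arguments, so `lr` is bound to the float instead of the whole dict. A minimal stdlib sketch of the same mechanism (the `make_optimizer` helper is hypothetical, standing in for `optim.Adam`'s signature):

```python
def make_optimizer(params, lr=0.001, betas=(0.9, 0.999)):
    # Hypothetical stand-in for optim.Adam(params, lr=..., betas=...).
    return {'params': params, 'lr': lr, 'betas': betas}

optim_parameters = {'lr': 0.01}

# Wrong: the dict is passed positionally, so lr becomes the dict itself.
# Any later arithmetic on group['lr'] then fails, like the TypeError above.
broken = make_optimizer(['w'], optim_parameters)
assert broken['lr'] == {'lr': 0.01}

# Right: ** unpacks the dict into lr=0.01.
fixed = make_optimizer(['w'], **optim_parameters)
assert fixed['lr'] == 0.01
```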