Is it possible to define an optimizer's parameters, such as the learning rate, in a dictionary before creating the optimizer?

I’ve tried:

        optim_parameters = {
            'lr': params.learning_rate, 
        }

        optimizer = optim.Adam([input], optim_parameters) 

Instead of:

        optimizer = optim.Adam([input], lr=params.learning_rate)

But I get this error:

        optimizer.step(closure)
      File "/usr/local/lib/python2.7/dist-packages/torch/optim/adam.py", line 104, in step
        step_size = group['lr'] * math.sqrt(bias_correction2) / bias_correction1
    TypeError: unsupported operand type(s) for *: 'dict' and 'float'

The dict is being passed as the second *positional* argument of `optim.Adam`, which is `lr`, so the learning rate ends up set to the whole dict and the multiplication inside `step()` fails. Unpack the dict into keyword arguments with `**` instead:

    optimizer = optim.Adam([input], **optim_parameters)
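
For completeness, here is a minimal self-contained sketch; the tensor and the learning-rate value are made up for illustration:

    import torch
    import torch.optim as optim

    # Hypothetical tensor to optimize; stands in for `input` above.
    input = torch.randn(3, requires_grad=True)

    optim_parameters = {
        'lr': 1e-2,  # stands in for params.learning_rate
    }

    # ** unpacks the dict into keyword arguments, i.e. Adam([input], lr=1e-2)
    optimizer = optim.Adam([input], **optim_parameters)

    loss = (input ** 2).sum()
    loss.backward()
    optimizer.step()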