SGD optimizer initialization

Hi! I want to set up an optimizer with these parameters:

base_lr: 1e-2
momentum: 0.9
weight_decay: 0.0001
lr_policy: "step"
display: 100
max_iter: 100000
snapshot: 10000
gamma: 0.5
stepsize: 200000
snapshot_prefix: "model/DR2_stage1"

I simply write this:

from torch.optim.lr_scheduler import StepLR

learning_rate = 0.001
optimizer = torch.optim.SGD(model.parameters(), lr=learning_rate, momentum=0.9)
scheduler = StepLR(optimizer, step_size=200000, gamma=0.5)

How can I apply the first parameters to this code?

Could you explain what you mean by "first parameters"?
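
Assuming "first parameters" refers to the solver settings listed at the top (they look like a Caffe solver prototxt), here is one possible mapping onto PyTorch. The model below is just a placeholder, and the training-loop part is only sketched in comments; `base_lr`, `momentum`, and `weight_decay` go into `SGD`, while `lr_policy: "step"` with `gamma` and `stepsize` corresponds to `StepLR`:

```python
import torch
import torch.nn as nn
from torch.optim.lr_scheduler import StepLR

# Placeholder model; substitute your own network.
model = nn.Linear(10, 2)

# base_lr, momentum, and weight_decay map directly onto SGD's arguments.
optimizer = torch.optim.SGD(
    model.parameters(),
    lr=1e-2,            # base_lr: 1e-2
    momentum=0.9,       # momentum: 0.9
    weight_decay=1e-4,  # weight_decay: 0.0001
)

# lr_policy "step" with gamma/stepsize maps to StepLR. Caffe counts
# iterations (batches), not epochs, so step the scheduler once per batch.
scheduler = StepLR(optimizer, step_size=200000, gamma=0.5)

max_iter = 100000          # max_iter: total number of iterations
display = 100              # display: log every `display` iterations
snapshot = 10000           # snapshot: checkpoint every `snapshot` iterations
snapshot_prefix = "model/DR2_stage1"

# Sketch of the corresponding training loop (loss computation omitted):
# for it in range(1, max_iter + 1):
#     optimizer.zero_grad()
#     ...forward pass, loss.backward()...
#     optimizer.step()
#     scheduler.step()
#     if it % display == 0:
#         print(f"iter {it}, lr {scheduler.get_last_lr()[0]}")
#     if it % snapshot == 0:
#         torch.save(model.state_dict(), f"{snapshot_prefix}_iter_{it}.pt")
```

Note that with `max_iter: 100000` and `stepsize: 200000`, the step decay would never actually fire during this run; that may be intentional in the original config, or the two values may be swapped.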