Hello,

I created an optimizer like so

```
import torch
from torch.nn import Sequential, Linear, ReLU, Dropout

model = Sequential(
    Linear(in_features, 64),
    ReLU(),
    Dropout(p=0.2),
    Linear(64, 32),
    ReLU(),
    Dropout(p=0.2),
    Linear(32, 3),
    ReLU(),
    Linear(3, 1))
opt = torch.optim.SGD(
    [
        {'params': model[0].parameters(), 'lr': 0.1},
        {'params': model[3:6].parameters(), 'lr': 0.01},
        {'params': model[6:].parameters(), 'lr': 0.001},
    ])
```

After serializing it with `opt.state_dict()`, I get:

```
{'state': {},
 'param_groups': [{'lr': 0.1,
                   'momentum': 0,
                   'dampening': 0,
                   'weight_decay': 0,
                   'nesterov': False,
                   'params': [139865343712712, 139865343712928]},
                  {'lr': 0.01,
                   'momentum': 0,
                   'dampening': 0,
                   'weight_decay': 0,
                   'nesterov': False,
                   'params': [139865343712640, 139865343713072]},
                  {'lr': 0.001,
                   'momentum': 0,
                   'dampening': 0,
                   'weight_decay': 0,
                   'nesterov': False,
                   'params': [139865343713144,
                              139865343713216,
                              139865343709616,
                              139865343710048]}]}
```

Which arguments should I pass to the SGD constructor to recreate the optimizer, given `model` and the above JSON? I think that if I could look up the model's parameters by the keys in `params`, such as `139865343712712`, I could recreate the original constructor input, but I haven't been able to do that.

I'm really having a hard time understanding the `Optimizer` class source code; any help would be greatly appreciated.
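For context, this is the kind of lookup I was attempting. It's only a sketch: `param_by_id` and `in_features` are names/values I made up here, and as far as I can tell the large integers in `params` are Python `id()` values, which are only meaningful inside the process that produced the state dict (newer PyTorch versions seem to store in-order indices instead).

```
from torch.nn import Sequential, Linear, ReLU, Dropout

in_features = 8  # placeholder; the exact value doesn't matter for the sketch

model = Sequential(
    Linear(in_features, 64), ReLU(), Dropout(p=0.2),
    Linear(64, 32), ReLU(), Dropout(p=0.2),
    Linear(32, 3), ReLU(), Linear(3, 1))

# Map each parameter tensor's id() to the tensor itself.
param_by_id = {id(p): p for p in model.parameters()}

# Within the same process, this resolves an id back to a tensor,
# e.g. the first Linear layer's weight:
assert param_by_id[id(model[0].weight)] is model[0].weight
```

This works within one Python process, but after loading the JSON in a new process the saved ids no longer match any live tensor, which is where I'm stuck.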

Thanks