I'm trying to remap a saved optimizer state to a new param_group and to check that tensor shapes line up between the two optimizers, but I realized that the entries in "params" are integers, not tensors. How should I think about this? The docs say optimizer.state_dict()["param_groups"][*]["params"] should be tensors.
```python
> torch.optim.SGD(nn.Linear(3, 2).parameters(), 0.1).state_dict()["param_groups"]
[{'lr': 0.1,
  'momentum': 0,
  'dampening': 0,
  'weight_decay': 0,
  'nesterov': False,
  'maximize': False,
  'foreach': None,
  'differentiable': False,
  'params': [0, 1]}]  # <-- ints ¯\_(ツ)_/¯
```
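For context, here is a small sketch of my current understanding (an assumption on my part, not something the docs spell out): the ints look like positional IDs, assigned in the order the params were passed in, and they key into `state_dict()["state"]` once per-param state such as a momentum buffer exists.

```python
import torch
import torch.nn as nn

model = nn.Linear(3, 2)
opt = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
model(torch.randn(4, 3)).sum().backward()
opt.step()  # first step populates the momentum buffers

sd = opt.state_dict()
ids = sd["param_groups"][0]["params"]  # -> [0, 1], not tensors

# Map the integer IDs back to the live tensors by position (assumed ordering):
id_to_param = dict(zip(ids, model.parameters()))
for pid in ids:
    print(pid, id_to_param[pid].shape, sd["state"][pid]["momentum_buffer"].shape)
```

If that positional interpretation is right, each ID's saved state tensor has the same shape as the corresponding live parameter.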
What is the recommended way to validate, when restoring an optimizer, that the parameters in param_group_orig match those in param_group_new (e.g. with different layer-wise LRs)? I guess the momentum terms are linked to the params somewhere behind the scenes as well?
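This is roughly the check I have in mind, as a hedged sketch: walk the saved state_dict and the new optimizer's live param_groups in parallel and compare shapes, on the assumption that load_state_dict matches params purely by position. The helper name `validate_remap` is mine, not a torch API.

```python
import torch
import torch.nn as nn

def validate_remap(saved_sd, new_opt):
    # Compare saved per-param state shapes against the new optimizer's live
    # params, position by position (the assumed matching rule).
    new_groups = new_opt.param_groups  # live tensors, unlike the state_dict ints
    if len(saved_sd["param_groups"]) != len(new_groups):
        raise ValueError("param_group count differs")
    for saved_g, new_g in zip(saved_sd["param_groups"], new_groups):
        if len(saved_g["params"]) != len(new_g["params"]):
            raise ValueError("param count differs within a group")
        for pid, p in zip(saved_g["params"], new_g["params"]):
            buf = saved_sd["state"].get(pid, {}).get("momentum_buffer")
            if buf is not None and buf.shape != p.shape:
                raise ValueError(
                    f"param {pid}: saved {tuple(buf.shape)} vs new {tuple(p.shape)}")
    return True

# Old optimizer: one group per tensor, so LRs can differ per layer later.
m_old = nn.Linear(3, 2)
opt_old = torch.optim.SGD(
    [{"params": [m_old.weight]}, {"params": [m_old.bias]}], lr=0.1, momentum=0.9)
m_old(torch.randn(4, 3)).sum().backward()
opt_old.step()  # creates the momentum buffers
sd = opt_old.state_dict()

# New optimizer: same group structure, different layer-wise LRs.
m_new = nn.Linear(3, 2)
opt_new = torch.optim.SGD(
    [{"params": [m_new.weight], "lr": 0.05},
     {"params": [m_new.bias], "lr": 0.5}], momentum=0.9)

validate_remap(sd, opt_new)   # shapes line up, so this passes
opt_new.load_state_dict(sd)
```

One caveat I ran into while testing this: load_state_dict also restores the hyperparameters (lr, momentum, ...) from the saved groups, so the new layer-wise LRs get overwritten by the saved ones unless you re-apply them afterwards.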