Load settings from config file and use setattr for torch modules?

So I just want to create a config file, say optimizer.config, like:

name: SGD
lr: 0.1
momentum: 0.9
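(In case it matters: I read the file with Python's configparser. A rough sketch of the loading step, assuming a [optimizer] section header is added at the top of the file, since configparser requires one:)

import configparser

# Parse the INI-style file; configparser needs a section header,
# so the file is assumed to start with a line reading [optimizer].
parser = configparser.ConfigParser()
parser.read('optimizer.config')
config = parser['optimizer']  # SectionProxy: supports config['name'], config.getfloat('lr')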

and I initialize my optimizer as below:

optimizer = getattr(torch.optim, config['name'])(
    filter(lambda p: p.requires_grad, model.parameters()),
    lr=config.getfloat('lr'),
)

I don't want to set each flag directly, like

momentum=config.getfloat('momentum')

because I may have more parameters, and I want to load them all in a loop, roughly like the sketch below.
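(A rough sketch of what I mean, assuming every key other than 'name' is a float hyperparameter:)

# Build the keyword arguments in a loop instead of hard-coding each one;
# assumes every key except 'name' can be parsed as a float.
kwargs = {key: config.getfloat(key) for key in config if key != 'name'}

optimizer = getattr(torch.optim, config['name'])(
    filter(lambda p: p.requires_grad, model.parameters()),
    **kwargs,
)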
But when I try to assign a value to 'momentum', it seems I cannot use setattr:

setattr(optimizer, 'momentum', 0.9)

It does not work… Is there any way I can load settings from a config file for torch modules?

Thanks a lot!

Can you please mention whether there is an error, or what the behavior is?
Also, please try:

momentum = getattr(optimizer, 'momentum')
momentum = 0.9

I am new to PyTorch, but I presume that even if PyTorch defines a custom __setattr__ that does more than plain assignment (e.g. one that also edits some computation graph elements), this should still work.

No error, it just doesn't assign the value.
Your solution is also not working.
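(For what it's worth, the reason seems to be that setattr(optimizer, 'momentum', 0.9) does create an attribute on the optimizer object, but torch.optim optimizers read their hyperparameters from optimizer.param_groups, so the new attribute is never consulted. A minimal sketch of updating an existing optimizer through param_groups instead:)

# Hyperparameters live in optimizer.param_groups (a list of dicts,
# one per parameter group), so update them there.
for group in optimizer.param_groups:
    group['momentum'] = 0.9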