I want to create a config file, say optimizer.config, like:
name: SGD
lr: 0.1
momentum: 0.9
and I initialize my optimizer like this:
optimizer = getattr(torch.optim, config['name'])(
    filter(lambda p: p.requires_grad, model.parameters()),
    lr=config.getfloat('lr'),
)
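For context, config here comes from Python's configparser (hence the getfloat call). A minimal sketch of the loading step, assuming the file starts with a section header such as [optimizer], since configparser requires one (the section name is just my example):

import configparser

# Parse optimizer.config; the [optimizer] section header is an
# assumption, as configparser cannot read a file without one.
parser = configparser.ConfigParser()
parser.read('optimizer.config')
config = parser['optimizer']  # supports config['name'] and config.getfloat('lr')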
I don't want to set each flag directly, e.g.
momentum=config.getfloat('momentum')
because I may have more hyperparameters later, and I'd rather load them all in a loop.
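Roughly, what I'm hoping for is something like the sketch below (my own attempt, assuming every key besides name is a float, which would obviously break for non-float settings like nesterov):

import torch

# Treat every key except 'name' as a float hyperparameter and
# forward them all as keyword arguments, so adding a new setting
# to the config file needs no code change.
kwargs = {key: config.getfloat(key) for key in config if key != 'name'}

optimizer_cls = getattr(torch.optim, config['name'])
optimizer = optimizer_cls(
    filter(lambda p: p.requires_grad, model.parameters()),
    **kwargs,
)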
But when I try to assign a value to "momentum" after construction, it seems I cannot use setattr:
setattr(optimizer, 'momentum', 0.9)
It does not work. Is there any way to load settings from a config file for torch modules?
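The closest thing I've found is that the hyperparameters seem to live in optimizer.param_groups rather than as attributes on the optimizer object, so maybe something like this sketch is the way, though I'm not sure it's the recommended approach:

# param_groups is a list of dicts, one per parameter group; the
# update rule reads hyperparameters from these dicts, so setattr
# on the optimizer itself never reaches them.
for group in optimizer.param_groups:
    group['momentum'] = 0.9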
Thanks in advance!