How to save the optimizer setting in a log in pytorch?

How can I save the optimizer settings in a log? I tried doing
print_log("=> optimizer '{}'".format(optimizer), log)
but I only got:

=> optimizer '<torch.optim.adadelta.Adadelta object at 0x7f9ed3cd4978>'

I need to save the settings the model was trained with: things such as the learning rate, weight decay, and, if I use a specific optimizer such as Adadelta, its particular parameters.
Is there a way to get around this?
Thanks a lot

Hi,

You can save the optimizer settings using optimizer.state_dict()

Here is an example for saving:

    states = {
        'epoch': epoch + 1,
        'arch': opt.arch,
        'state_dict': model.state_dict(),
        'optimizer': optimizer.state_dict(),
    }
    torch.save(states, save_file_path)

Here is an example for loading the settings:

optimizer.load_state_dict(checkpoint['optimizer'])

Best Regards,
Tal


Thanks, but I want to save them in a log.txt file.

You can still save the state_dict to a log.txt file.

The output of optimizer.state_dict() is:

{'state': {}, 'param_groups': [{'lr': 0.0001, 'betas': (0.9, 0.999), 'eps': 1e-08, 'weight_decay': 0, 'amsgrad': False, 'params': [2666935098032, 2666957524496]}]}

It's a plain dict that serializes to JSON, so you can save it in any format you want and load it back later.
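For example, a minimal sketch of appending the settings to log.txt as a JSON line and reading them back. To keep the sketch runnable without torch, it uses the literal dict shown above; in practice you would pass optimizer.state_dict() instead:

```python
import json

# Stand-in for optimizer.state_dict(); this is the dict printed above.
state = {
    'state': {},
    'param_groups': [{
        'lr': 0.0001, 'betas': (0.9, 0.999), 'eps': 1e-08,
        'weight_decay': 0, 'amsgrad': False,
        'params': [2666935098032, 2666957524496],
    }],
}

# Append the optimizer settings to the training log as one JSON line.
with open('log.txt', 'a') as f:
    f.write(json.dumps(state) + '\n')

# Read the most recent settings back from the log.
with open('log.txt') as f:
    restored = json.loads(f.readlines()[-1])

print(restored['param_groups'][0]['lr'])
```

Note that json.dumps turns the 'betas' tuple into a list, so compare against [0.9, 0.999] after a round trip.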


Thanks a lot, that's it!
Greatly appreciated!

You're welcome, glad I could help!

Hi thadar,

Sorry to bother you. I have a question: what do the "state" key and the "params" entry inside "param_groups" stand for in the optimizer's state_dict?

Thanks~