How can I save the optimizer settings in a log? I tried doing print_log("=> optimizer '{}'".format(optimizer), log)
but I only got :
=> optimizer '<torch.optim.adadelta.Adadelta object at 0x7f9ed3cd4978>'
I need to save the settings the model was trained with: the learning rate, weight decay, and, for a specific optimizer such as Adadelta, its own parameters as well.
Is there a way to get around this?
Thanks a lot
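For context, here is a minimal, self-contained sketch of what I am doing. print_log is my own helper (stubbed out here), and the loop at the end is my guess at how to pull the hyperparameters out, assuming they live in optimizer.param_groups:

```python
import torch.nn as nn
import torch.optim as optim

# Stub of my print_log helper: print and append to a log file.
def print_log(msg, log_path):
    print(msg)
    with open(log_path, "a") as f:
        f.write(msg + "\n")

model = nn.Linear(10, 2)
optimizer = optim.Adadelta(model.parameters(), lr=1.0, rho=0.9,
                           eps=1e-6, weight_decay=1e-4)

# What I tried: in my version this only logs the object's repr.
print_log("=> optimizer '{}'".format(optimizer), "train.log")

# What I think I need: dump each param group's hyperparameters,
# dropping the 'params' entry (the actual tensors).
for group in optimizer.param_groups:
    hyper = {k: v for k, v in group.items() if k != "params"}
    print_log("=> optimizer settings: {}".format(hyper), "train.log")
```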