Current step from optimizer

Quick question: is there a way to get the current step number from the optimizer, or does the user have to keep count explicitly? I am assuming it maintains a step count internally, since that value would be needed for annealing the LR.

optimizer.state_dict() does return a bunch of things, but current_step does not seem to be one of them.

Thanks!

Hi,

Some optimizers that need a step count, like Adam, keep it as step in the per-parameter state (self.state), keyed by the parameters held in self.param_groups.
See https://github.com/pytorch/pytorch/blob/master/torch/optim/adam.py#L72-L106.
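
For example, here is a minimal sketch (assuming a recent PyTorch build, where Adam initializes its per-parameter state lazily on the first step() call):

import torch

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

loss = model(torch.randn(8, 4)).sum()
loss.backward()
optimizer.step()

# Each parameter's state entry now carries the shared step count.
for p in model.parameters():
    print(optimizer.state[p]["step"])  # 1 (may be a tensor in newer versions)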

On the other hand, optimizers like vanilla SGD do not track a step count at all.
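
In that case the user has to keep count explicitly. A sketch of one way to do that (assuming plain SGD with the default momentum=0, which stores no per-parameter state):

import torch

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

global_step = 0  # manual counter maintained by the user
for _ in range(3):
    optimizer.zero_grad()
    loss = model(torch.randn(8, 4)).sum()
    loss.backward()
    optimizer.step()
    global_step += 1

print(len(optimizer.state))  # 0: vanilla SGD keeps no state to read a step from
print(global_step)           # 3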


# All parameters share the same count; read it from any one of them.
step = optimizer.state[optimizer.param_groups[0]["params"][-1]]["step"]
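
Note that the state entry only exists after the first call to optimizer.step(); reading "step" before then raises a KeyError. In recent PyTorch versions the stored value may also be a tensor rather than a plain int, in which case you can convert it with int(step) or step.item().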
