Changing the "milestones" of a saved MultiStepLR scheduler

I have my optimizer’s state_dict() and my model.state_dict() saved along with the epoch number, and I want to add more milestones when I load the model and optimizer and resume training.

The current milestones end at epoch 4000, and I’d like to add 4250, 4500, 4750, etc.

Is it possible to load, change the milestones, and continue training without messing anything up?


It should be fine to just create a new scheduler with the changed milestones; the implementation is pretty lightweight (see the LRScheduler and MultiStepLR source here). Just make sure to pass the saved epoch as the last_epoch parameter when constructing the new MultiStepLR, otherwise the scheduler will assume it’s starting from the first epoch.
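A minimal sketch of what that could look like. The model, learning rate, and gamma here are made-up placeholders (the original post only mentions the milestones), and the loop just stands in for real training and checkpoint loading:

```python
import torch

# Placeholder model/optimizer; lr and gamma are assumptions, not from the post.
model = torch.nn.Linear(2, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Original schedule, ending at milestone 4000 (earlier milestone assumed).
scheduler = torch.optim.lr_scheduler.MultiStepLR(
    optimizer, milestones=[3000, 4000], gamma=0.1)
for _ in range(4000):          # stand-in for the real training loop
    scheduler.step()

# ... save / load model.state_dict() and optimizer.state_dict() here ...
resume_epoch = scheduler.last_epoch    # the saved epoch number

# MultiStepLR with last_epoch != -1 expects an 'initial_lr' key in each
# param group; attaching the first scheduler sets it, but set it manually
# if you rebuilt the optimizer from scratch after loading.
for group in optimizer.param_groups:
    group.setdefault('initial_lr', 0.1)

# New scheduler with the extra milestones, resuming where we left off.
scheduler = torch.optim.lr_scheduler.MultiStepLR(
    optimizer, milestones=[3000, 4000, 4250, 4500, 4750], gamma=0.1,
    last_epoch=resume_epoch)
print(scheduler.get_last_lr())   # lr after the two earlier decays
```

Since the loaded optimizer state already carries the decayed learning rate, the new scheduler just picks up the counter and applies the extra milestones as training continues.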
