Resetting scheduler and optimizer learning rate

I was wondering how I would be able to reset the scheduler and optimizer to the parameters they were initialized with.

The reason I need this is that I’m currently injecting these objects into a Trainer object which handles the K-fold validation training. Everything is fine on the first fold, but when I reach the second fold the optimizer still uses the learning rate it ended the previous fold with.

The only way I have thought of to get around this is to reinitialize the optimizer and scheduler, though that would defeat the purpose of the dependency injection.


There isn’t a built-in way to reset these objects to their initial state. Your best bet is to reinitialize them.

One option is to have the Trainer take the constructor parameters needed to create the optimizer / scheduler, and have the Trainer create these objects itself. That way, whenever the Trainer wants to reinitialize them, it has the necessary initial state to do so.
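A minimal sketch of that idea, assuming a hypothetical `Trainer` class (the names `Trainer` and `fit_fold` are illustrative, not from any particular library): instead of being handed ready-made objects, the Trainer stores the classes and keyword arguments and rebuilds fresh instances at the start of every fold.

```python
import torch
from torch import nn, optim

class Trainer:
    def __init__(self, model, optimizer_cls, optimizer_kwargs,
                 scheduler_cls, scheduler_kwargs):
        # Store the recipes, not the objects, so they can be rebuilt.
        self.model = model
        self.optimizer_cls = optimizer_cls
        self.optimizer_kwargs = optimizer_kwargs
        self.scheduler_cls = scheduler_cls
        self.scheduler_kwargs = scheduler_kwargs

    def _fresh_optim(self):
        # Rebuild optimizer and scheduler with their initial settings.
        optimizer = self.optimizer_cls(self.model.parameters(),
                                       **self.optimizer_kwargs)
        scheduler = self.scheduler_cls(optimizer, **self.scheduler_kwargs)
        return optimizer, scheduler

    def fit_fold(self):
        # Each fold begins with freshly constructed objects.
        optimizer, scheduler = self._fresh_optim()
        # ... training loop for one fold would go here ...
        return optimizer, scheduler

model = nn.Linear(4, 2)
trainer = Trainer(model,
                  optim.SGD, {"lr": 0.1},
                  optim.lr_scheduler.StepLR, {"step_size": 1, "gamma": 0.5})

opt1, sch1 = trainer.fit_fold()
sch1.step()                    # fold 1 decays the lr from 0.1 to 0.05
opt2, _ = trainer.fit_fold()   # fold 2 starts from the initial lr again
```

The key point is that fold 2's optimizer is a brand-new object, so nothing from fold 1's learning-rate schedule carries over.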


Ended up passing a factory object into the experiment, thanks!
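For readers landing here later, the factory approach might look something like this sketch (the `make_optimizers` callable and `run_fold` method are hypothetical names): the Trainer is injected with a callable that builds a fresh optimizer + scheduler pair on demand, once per fold, which preserves the dependency injection.

```python
import torch
from torch import nn, optim

def make_optimizers(params):
    # Factory: builds a brand-new optimizer/scheduler pair each call.
    optimizer = optim.Adam(params, lr=1e-3)
    scheduler = optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.9)
    return optimizer, scheduler

class Trainer:
    def __init__(self, model, optimizer_factory):
        # Inject the factory instead of the concrete objects.
        self.model = model
        self.optimizer_factory = optimizer_factory

    def run_fold(self):
        # Every fold gets fresh objects at the initial learning rate.
        optimizer, scheduler = self.optimizer_factory(self.model.parameters())
        # ... training loop for one fold would go here ...
        return optimizer, scheduler

model = nn.Linear(3, 1)
trainer = Trainer(model, make_optimizers)

opt_a, sch_a = trainer.run_fold()
sch_a.step()                   # fold 1 decays the lr from 1e-3 to 9e-4
opt_b, _ = trainer.run_fold()  # fold 2 starts again at 1e-3
```

Compared with passing constructor parameters, a factory keeps the Trainer ignorant of which optimizer and scheduler classes are used, which is usually the cleaner injection boundary.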