I was wondering how I can reset the scheduler and optimizer to the parameters they were initialized with.
The reason I need this is that I’m currently injecting these objects into a Trainer object which handles the K-fold cross-validation training. Everything is fine on the first fold, but when I reach the second fold the optimizer still carries the learning rate left over from the previous fold.
The only way I have thought of to get around this is to reinitialize the optimizer and scheduler, though that would defeat the purpose of the dependency injection.
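One alternative I have considered is snapshotting the state dicts instead of reinitializing. The sketch below assumes PyTorch and uses a placeholder model, `SGD`, and `StepLR` purely for illustration; the idea is to deep-copy the freshly initialized `state_dict()` of the injected optimizer and scheduler once, then restore those copies at the start of each fold, so the injected objects themselves are reused:

```python
import copy
import torch

# Placeholder model/optimizer/scheduler standing in for the injected objects.
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.5)

# Capture the freshly initialized states once, before any training.
# deepcopy matters: state_dict() can hold references to live internal state.
initial_opt_state = copy.deepcopy(optimizer.state_dict())
initial_sched_state = copy.deepcopy(scheduler.state_dict())

for fold in range(2):
    # Restore the injected objects to their initial state for this fold.
    optimizer.load_state_dict(initial_opt_state)
    scheduler.load_state_dict(initial_sched_state)

    for epoch in range(3):
        optimizer.step()      # training step would go here
        scheduler.step()      # decays the lr within the fold

# Restoring once more brings the lr back to its initial value.
optimizer.load_state_dict(initial_opt_state)
scheduler.load_state_dict(initial_sched_state)
print(optimizer.param_groups[0]["lr"])
```

This keeps the Trainer unaware of how the objects were constructed: it only needs the two state snapshots, which could themselves be passed in alongside the optimizer and scheduler.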