Hi, I was trying to implement this:
i.e., linearly increase the learning rate every epoch until it reaches a certain target value. Any advice on how to do this with the Adam optimizer and the LR scheduler?
Thanks!
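For anyone landing here with the same question, here is a minimal sketch of one way this can be done with `torch.optim.lr_scheduler.LambdaLR`: set Adam's `lr` to the target value, then have the lambda return a multiplier that grows linearly from `1/warmup_epochs` to `1.0`. The model, `target_lr`, and `warmup_epochs` below are placeholder values for illustration.

```python
import torch
from torch.optim import Adam
from torch.optim.lr_scheduler import LambdaLR

# Placeholder model and hyperparameters -- adjust to your setup.
model = torch.nn.Linear(10, 1)
target_lr = 1e-3
warmup_epochs = 5

# Adam is created with the *target* LR; the scheduler scales it down
# during warmup via a per-epoch multiplier.
optimizer = Adam(model.parameters(), lr=target_lr)

# Multiplier rises linearly: 1/5, 2/5, ..., 1.0, then stays at 1.0.
scheduler = LambdaLR(
    optimizer,
    lr_lambda=lambda epoch: min(1.0, (epoch + 1) / warmup_epochs),
)

for epoch in range(8):
    # ... forward pass, loss, backward, optimizer.step() ...
    scheduler.step()  # call once per epoch, after the optimizer steps
    print(epoch, optimizer.param_groups[0]["lr"])
```

Note that `LambdaLR` applies the lambda immediately on construction, so training starts at `target_lr / warmup_epochs` rather than at `target_lr`. Recent PyTorch versions also ship `torch.optim.lr_scheduler.LinearLR`, which expresses the same warmup via `start_factor` / `total_iters` without a custom lambda.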