I want to warm up my learning rate linearly with LinearLR and then switch over to ReduceLROnPlateau. I assumed I could chain the two with SequentialLR, as below:
warmup_scheduler = torch.optim.lr_scheduler.LinearLR(
    self.model_optim, start_factor=0.05, end_factor=1, total_iters=3
)
reduce_lrop_scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    self.model_optim,
    "min",
    patience=self.reduce_lr_on_plateau_patience,
    threshold=self.reduce_lr_on_plateau_threshold,
)
lr_scheduler = torch.optim.lr_scheduler.SequentialLR(
    self.model_optim,
    schedulers=[warmup_scheduler, reduce_lrop_scheduler],
    milestones=[5],
)
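For context, I hand the combined scheduler over to Lightning from configure_optimizers roughly like this (slightly simplified; the "val_loss" monitor name is just a stand-in for whatever metric ReduceLROnPlateau is meant to watch):

def configure_optimizers(self):
    # warmup_scheduler, reduce_lrop_scheduler and lr_scheduler are the
    # objects constructed above
    return {
        "optimizer": self.model_optim,
        "lr_scheduler": {
            "scheduler": lr_scheduler,
            "monitor": "val_loss",   # metric for ReduceLROnPlateau
            "interval": "epoch",     # step the scheduler once per epoch
        },
    }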
As expected, the LR goes up from almost 0 and reaches a constant value, but when I hit the first milestone I get an error saying ReduceLROnPlateau doesn’t have an attribute called get_last_lr.
Full error below:
File "/home/<user>/miniconda3/envs/pyt/lib/python3.9/site-packages/pytorch_lightning/core/lightning.py", line 1560, in lr_scheduler_step
scheduler.step()
File "/home/<user>/miniconda3/envs/pyt/lib/python3.9/site-packages/torch/optim/lr_scheduler.py", line 647, in step
self._last_lr = self._schedulers[idx].get_last_lr()
AttributeError: 'ReduceLROnPlateau' object has no attribute 'get_last_lr'
I’m using PyTorch Lightning to handle the optimisation, but I assume the problem is an incompatibility between ReduceLROnPlateau and SequentialLR. I looked around various forums but couldn’t find a satisfactory answer.
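To double-check that Lightning isn’t the culprit, here is a minimal sketch outside Lightning that I think hits the same incompatibility (the toy Linear model, SGD optimizer and hard-coded patience are placeholders, not my real setup):

import torch

# placeholder model and optimizer, just to drive the schedulers
model = torch.nn.Linear(10, 1)
optim = torch.optim.SGD(model.parameters(), lr=3e-5)

warmup = torch.optim.lr_scheduler.LinearLR(
    optim, start_factor=0.05, end_factor=1, total_iters=3
)
plateau = torch.optim.lr_scheduler.ReduceLROnPlateau(optim, "min", patience=2)
scheduler = torch.optim.lr_scheduler.SequentialLR(
    optim, schedulers=[warmup, plateau], milestones=[5]
)

for epoch in range(10):
    optim.step()
    # SequentialLR.step() takes no metrics argument, so there is also no obvious
    # way to pass the monitored value through to ReduceLROnPlateau; once the
    # milestone is reached this should hit the same AttributeError as above
    scheduler.step()
    print(epoch, optim.param_groups[0]["lr"])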
Side note: I’d like the learning rate to end up at 3e-5 after the warm-up, so I set the initial LR to 3e-5, end_factor to 1, and start_factor to 0.05. However, the LR after warm-up ends up at 1.5e-6, which is off by a factor of 20 and is exactly 3e-5 * start_factor. I don’t quite understand why this happens; help on that would also be appreciated.
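Re the side note, my expectation comes from how LinearLR behaves on its own; with a throwaway optimizer I’d expect the LR to start at 3e-5 * 0.05 = 1.5e-6 and to reach 3e-5 * 1 = 3e-5 once total_iters steps have passed:

import torch

optim = torch.optim.SGD(torch.nn.Linear(10, 1).parameters(), lr=3e-5)
warmup = torch.optim.lr_scheduler.LinearLR(
    optim, start_factor=0.05, end_factor=1, total_iters=3
)

for epoch in range(5):
    # expected: ~1.5e-06 at epoch 0, ramping up to ~3e-05 by epoch 3
    print(epoch, optim.param_groups[0]["lr"])
    optim.step()
    warmup.step()

With SequentialLR in the mix, though, the LR I read back after the warm-up is still 1.5e-6.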
Thanks.
I’m on torch version 1.11.0+cu113, and pytorch-lightning version 1.6.4