Shouldn't ReduceLROnPlateau() call super().__init__(optimizer) in its init?

Regarding ReduceLROnPlateau():

import torch
from torch.optim.lr_scheduler import ReduceLROnPlateau

optim = torch.optim.AdamW(model.parameters(), lr=param['lr'], amsgrad=True)
scheduler = ReduceLROnPlateau(optimizer=optim, mode='max', patience=1, verbose=True, factor=0.5)
val_roc = torch.tensor(15.0, device=device)
scheduler.step(val_roc)
scheduler.get_last_lr()
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-39-f8112ad93830> in <module>
----> 1 scheduler.get_last_lr()

AttributeError: 'ReduceLROnPlateau' object has no attribute 'get_last_lr'

Inside ReduceLROnPlateau's step() we do:
self._last_lr = [group['lr'] for group in self.optimizer.param_groups]

Yet there is no public way to pull that value out.
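The only way to reach it right now is the private attribute itself; a minimal sketch (it relies on a private name, so it may change between releases):

scheduler.step(val_roc)
print(scheduler._last_lr)  # reaches into a private attribute; not a public API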

The class _LRScheduler(object) has:

    def get_last_lr(self):
        """ Return last computed learning rate by current scheduler.
        """
        return self._last_lr

But inside ReduceLROnPlateau()'s __init__ we do not have:

super(ReduceLROnPlateau, self).__init__(optimizer, last_epoch)

I'm just trying to track the LR as my model trains, and ran into this. I think someone forgot to call super() into _LRScheduler?

You could create a feature request for get_last_lr() for this scheduler; in the meantime, read the 'lr' attribute of the optimizer's parameter groups directly:

print(optim.param_groups[0]['lr'])
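For example, a minimal sketch of logging the learning rate once per epoch this way (train_one_epoch, validate, and num_epochs are placeholders for your own training loop):

lr_history = []
for epoch in range(num_epochs):
    train_one_epoch(model, optim)              # placeholder: your training step
    val_roc = validate(model)                  # placeholder: your validation metric
    scheduler.step(val_roc)                    # ReduceLROnPlateau adjusts the LR based on the metric
    current_lr = optim.param_groups[0]['lr']   # read the LR straight from the optimizer
    lr_history.append(current_lr)
    print(f'epoch {epoch}: lr = {current_lr}')

If you use several parameter groups, log each group['lr'] instead of just index 0.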