/torch/optim/lr_scheduler.py:28: UserWarning: The verbose parameter is deprecated

How can I get rid of this warning? I’m on 2.2.0+cu121.

/Lib/site-packages/torch/optim/lr_scheduler.py:28: UserWarning: The verbose parameter is deprecated. Please use get_last_lr() to access the learning rate.
  warnings.warn("The verbose parameter is deprecated. Please use get_last_lr() "

Don’t pass the verbose argument to the scheduler, as explained in the warning.
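For example, a minimal sketch (the optimizer and hyperparameters here are placeholders, not your exact setup) — just drop `verbose=True` and the warning goes away:

```python
import torch
from torch.optim.lr_scheduler import ReduceLROnPlateau

# Dummy parameter standing in for a real model's parameters.
optimizer = torch.optim.AdamW([torch.nn.Parameter(torch.zeros(1))], lr=1e-3)

# Same configuration as before, just without verbose=True, so no UserWarning
# is raised on construction.
scheduler = ReduceLROnPlateau(
    optimizer, factor=0.1, patience=5, threshold=1e-3, min_lr=1e-9
)
```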

1 Like

Could you please tell me how I can now see in the console that ReduceLROnPlateau has fired and changed the LR?
I’m using Lightning and implementing the configure_optimizers method like this:

    def configure_optimizers(self):
        optimizer = torch.optim.AdamW(params=self.parameters(), lr=self.lr)
        return {
            "optimizer": optimizer,
            "lr_scheduler": {
                "scheduler": ReduceLROnPlateau(
                    optimizer=optimizer,
                    verbose=True,
                    min_lr=1e-9,
                    patience=self.lr_scheduler_patience,
                    threshold=1e-3,
                    factor=0.1,
                ),
                "monitor": "val_loss",
                "interval": "step",  # "step" or "epoch"
                "frequency": self.lr_scheduler_freq,
            },
        }

Before the update, verbose worked as expected: after N validations without improvement, the LR was reduced and a message was printed to the console.
Now, when I start training, I get the warning described above. The loss curve shows that ReduceLROnPlateau is still working, but nothing is logged to the console. To be honest, it is not clear how to get that logging back after these changes.

1 Like

Did you figure it out?

Nope, still waiting for ptrblck’s answer

.get_last_lr() is the new and recommended way to check the current learning rate. I don’t know enough about Lightning or how it prints the learning rate during training.