Learning rate scheduling in PyTorch

I was trying to train a model and adjust the learning rate after a few epochs to see the effects on training.
Specifically, I’d like to adjust the learning rate when the loss function plateaus.
Is there a way to do this in PyTorch?

Read the docs: https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate

There are many ways to do this, including `ReduceLROnPlateau` for exactly the plateau case you describe.
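A minimal sketch of how `torch.optim.lr_scheduler.ReduceLROnPlateau` could be wired in (the model, optimizer settings, and the simulated validation losses here are placeholders for illustration, not from the thread):

```python
import torch
from torch import nn

model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Reduce the LR when the monitored metric stops improving.
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer,
    mode="min",   # the metric should decrease (e.g. validation loss)
    factor=0.1,   # multiply the LR by this on a plateau
    patience=2,   # wait this many non-improving epochs before reducing
)

# Simulated per-epoch validation losses that stop improving after epoch 1.
val_losses = [1.0, 0.8, 0.8, 0.8, 0.8, 0.8]
for epoch, val_loss in enumerate(val_losses):
    # ... training step for this epoch would go here ...
    # Unlike other schedulers, ReduceLROnPlateau takes the metric itself:
    scheduler.step(val_loss)

# After three consecutive non-improving epochs (patience=2 exceeded),
# the LR has been reduced from 0.1 to 0.01.
print(optimizer.param_groups[0]["lr"])
```

Note that `scheduler.step(metric)` should be called once per epoch after validation, and it must be passed the value being monitored.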


Thank you! I somehow couldn’t find this, my bad
