How to adjust the learning rate in a custom way?

Hi, guys,
I want to adjust the learning rate in a custom way, but the docs only describe the predefined schedulers (see PyTorch – How to adjust learning rate). So I am wondering whether there is a tutorial or demo I can follow to define a custom learning rate adjustment strategy.

Any guide or answer will be appreciated!

You could check if LambdaLR would fit your use case, and if not, you could derive a custom learning rate scheduler using any of the provided schedulers as a template.
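
As a rough illustration, a custom scheduler can subclass torch.optim.lr_scheduler._LRScheduler and override get_lr(). The warmup rule and the name WarmupLR below are just made-up examples for the sketch, not an existing PyTorch scheduler:

import torch
from torch.optim.lr_scheduler import _LRScheduler

class WarmupLR(_LRScheduler):
    """Example: linearly warm up the lr for `warmup_epochs` epochs, then keep it constant."""
    def __init__(self, optimizer, warmup_epochs=5, last_epoch=-1):
        self.warmup_epochs = warmup_epochs
        super().__init__(optimizer, last_epoch)

    def get_lr(self):
        # self.base_lrs holds the initial lr of each param group,
        # self.last_epoch is incremented by scheduler.step()
        if self.last_epoch < self.warmup_epochs:
            scale = (self.last_epoch + 1) / self.warmup_epochs
        else:
            scale = 1.0
        return [base_lr * scale for base_lr in self.base_lrs]

You would then use it like any built-in scheduler, e.g. scheduler = WarmupLR(optimizer, warmup_epochs=5) and call scheduler.step() once per epoch.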


Let me look at this reference material from the docs:

# From the docs: one lr_lambda per parameter group
from torch.optim.lr_scheduler import LambdaLR

# Assuming `optimizer` was created with two parameter groups
lambda1 = lambda epoch: epoch // 30      # scales group 0's lr
lambda2 = lambda epoch: 0.95 ** epoch    # scales group 1's lr
scheduler = LambdaLR(optimizer, lr_lambda=[lambda1, lambda2])
for epoch in range(100):
    train(...)
    validate(...)
    scheduler.step()

If I use the scheduler as in the example above, will the lr be processed by lambda1 and then lambda2 sequentially, like a pipeline?
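
For reference, LambdaLR applies one lambda per parameter group, multiplying each group's initial lr by its own lambda's value; the lambdas are not chained. A small toy check (the parameters w1/w2 below are just placeholders for illustration):

import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import LambdaLR

# Toy optimizer with two parameter groups
w1 = torch.nn.Parameter(torch.zeros(1))
w2 = torch.nn.Parameter(torch.zeros(1))
optimizer = SGD([{"params": [w1], "lr": 0.1},
                 {"params": [w2], "lr": 0.1}])

lambda1 = lambda epoch: epoch // 30
lambda2 = lambda epoch: 0.95 ** epoch
scheduler = LambdaLR(optimizer, lr_lambda=[lambda1, lambda2])

for epoch in range(3):
    optimizer.step()
    scheduler.step()
    # Group 0's lr = 0.1 * lambda1(epoch), group 1's lr = 0.1 * lambda2(epoch)
    print([group["lr"] for group in optimizer.param_groups])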