Hello, I have seen some forum posts about learning rate decay in PyTorch, for example here. They said that we can adaptively change the learning rate in PyTorch by using this code:
def adjust_learning_rate(optimizer, epoch):
    """Sets the learning rate to the initial LR decayed by 10 every 30 epochs"""
    lr = args.lr * (0.1 ** (epoch // 30))
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr
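For context, here is a minimal sketch of how I understand this function would be called, once per epoch. The toy model and the Args object holding the initial learning rate are my own placeholders, mimicking the argparse-style args.lr used above:

import torch.nn as nn
import torch.optim as optim

class Args:          # stand-in for argparse results; only lr is needed here
    lr = 0.1
args = Args()

model = nn.Linear(10, 2)                               # toy model, just for illustration
optimizer = optim.SGD(model.parameters(), lr=args.lr)

for epoch in range(90):
    adjust_learning_rate(optimizer, epoch)             # lr = 0.1, 0.01, 0.001 at epochs 0, 30, 60
    # ... run one epoch of training with this optimizer ...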
*) My question is: has this been implemented as a built-in feature in PyTorch version 0.1 or 0.2, or must we define it manually like the code above?
*) If we must define it manually like that function, may I know from your experience what the best epoch interval for dropping the learning rate is? For example, in that code it is every 30 epochs.
-Thank you-
def poly_lr_scheduler(optimizer, init_lr, iter, lr_decay_iter=1,
                      max_iter=100, power=0.9):
    """Polynomial decay of learning rate
    :param init_lr is base learning rate
    :param iter is a current iteration
    :param lr_decay_iter how frequently decay occurs, default is 1
    :param max_iter is number of maximum iterations
    :param power is a polynomial power
    """
    # Only update on iterations that are multiples of lr_decay_iter,
    # and never past max_iter.
    if iter % lr_decay_iter or iter > max_iter:
        return optimizer

    # float() guards against integer division under Python 2
    lr = init_lr * (1 - float(iter) / max_iter) ** power
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr

    return lr
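And a minimal sketch of how this per-iteration scheduler could be driven inside a training loop. The model, optimizer, and loop bounds below are my own placeholder assumptions, not part of the original snippet:

import torch.nn as nn
import torch.optim as optim

model = nn.Linear(10, 2)                          # toy model, placeholder
init_lr = 0.01
max_iter = 100
optimizer = optim.SGD(model.parameters(), lr=init_lr)

for it in range(1, max_iter + 1):
    # lr shrinks polynomially from init_lr toward 0 as it approaches max_iter
    poly_lr_scheduler(optimizer, init_lr, it, lr_decay_iter=1,
                      max_iter=max_iter, power=0.9)
    # ... forward / backward / optimizer.step() for one batch ...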