Hello, I have seen some forum posts about learning rate decay in PyTorch, for example here. They said that we can adaptively change our learning rate in PyTorch by using this code:
```python
def adjust_learning_rate(optimizer, epoch):
    """Sets the learning rate to the initial LR decayed by 10 every 30 epochs"""
    lr = args.lr * (0.1 ** (epoch // 30))
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr
```
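For context, here is a minimal sketch of how I understand such a function would be called in a training loop. The model, optimizer, and `initial_lr` (standing in for `args.lr` above) are all illustrative assumptions, not part of the original snippet:

```python
import torch.nn as nn
import torch.optim as optim

initial_lr = 0.01  # assumed value, stands in for args.lr from the snippet above

def adjust_learning_rate(optimizer, epoch):
    """Sets the learning rate to the initial LR decayed by 10 every 30 epochs."""
    lr = initial_lr * (0.1 ** (epoch // 30))
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr

model = nn.Linear(10, 1)  # toy model, illustrative only
optimizer = optim.SGD(model.parameters(), lr=initial_lr)

for epoch in range(90):
    adjust_learning_rate(optimizer, epoch)  # update the LR once per epoch
    # ... usual forward/backward/optimizer.step() over the training batches ...
```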
*) My question is: has this been implemented as a default feature in PyTorch version 0.1 or 0.2, or must we define it manually like the code above?
*) If we must define it manually like that function, may I know from your experience the best epoch interval for dropping the learning rate? For example, in that code it is every 30 epochs.