line 60, in __init__
cycle_momentum=False)
File "/Users/boris/conda/lib/python3.6/site-packages/torch/optim/lr_scheduler.py", line 593, in __init__
self.base_momentums = list(map(lambda group: group['momentum'], optimizer.param_groups))
File "/Users/boris/conda/lib/python3.6/site-packages/torch/optim/lr_scheduler.py", line 593, in <lambda>
self.base_momentums = list(map(lambda group: group['momentum'], optimizer.param_groups))
KeyError: 'momentum'
That’s because Adam doesn’t expose a `momentum` hyperparameter the way SGD does; it uses `betas` instead. Setting cycle_momentum=False is supposed to skip the momentum handling, but the traceback shows that `base_momentums` is still read from every param group unconditionally. Either I am doing something wrong or there’s a bug in the scheduler code.
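One workaround that has been reported for this situation (assuming, as the traceback suggests, that the scheduler reads `group['momentum']` unconditionally) is to inject a dummy `momentum` key into each param group before constructing the scheduler; with cycle_momentum=False the value is never actually applied. A minimal sketch of the failure and the workaround, using plain dicts in place of real `optimizer.param_groups`:

```python
# Plain-dict stand-ins for optimizer.param_groups: SGD-style groups
# carry a 'momentum' key, Adam-style groups carry 'betas' instead.
sgd_groups = [{"lr": 0.01, "momentum": 0.9}]
adam_groups = [{"lr": 0.001, "betas": (0.9, 0.999)}]

def read_base_momentums(param_groups):
    # Mirrors the line in the traceback: an unconditional lookup of
    # group['momentum'], which raises KeyError for Adam-style groups.
    return list(map(lambda group: group["momentum"], param_groups))

# Works for SGD-style groups:
print(read_base_momentums(sgd_groups))  # [0.9]

# Raises KeyError for Adam-style groups:
try:
    read_base_momentums(adam_groups)
except KeyError as err:
    print("KeyError:", err)

# Workaround: give each group a harmless dummy 'momentum' entry
# before constructing the scheduler.
for group in adam_groups:
    group.setdefault("momentum", 0.0)

print(read_base_momentums(adam_groups))  # [0.0]
```

The same `setdefault` loop over the real `optimizer.param_groups` should sidestep the KeyError on affected versions, though upgrading PyTorch is the cleaner fix if a release has since guarded that lookup behind the cycle_momentum flag.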