CyclicLR doesn't work with Adam

line 60, in __init__
    cycle_momentum=False)
  File "/Users/boris/conda/lib/python3.6/site-packages/torch/optim/lr_scheduler.py", line 593, in __init__
    self.base_momentums = list(map(lambda group: group['momentum'], optimizer.param_groups))
  File "/Users/boris/conda/lib/python3.6/site-packages/torch/optim/lr_scheduler.py", line 593, in <lambda>
    self.base_momentums = list(map(lambda group: group['momentum'], optimizer.param_groups))
KeyError: 'momentum'

That’s because Adam doesn’t have a momentum parameter the way SGD does (it uses betas instead), so its param groups have no 'momentum' key. Setting cycle_momentum=False is supposed to avoid this, but it doesn’t. Either I am doing something wrong or there’s a bug in the code.
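
For reference, the initialization that triggers the error looks roughly like this (the model and learning rates are just placeholders, not my actual setup):

from torch import nn, optim

model = nn.Linear(10, 2)  # placeholder model, the error doesn't depend on it
optimizer = optim.Adam(model.parameters(), lr=1e-4)

# On affected PyTorch versions this raises KeyError: 'momentum', because
# CyclicLR.__init__ reads group['momentum'] even when cycle_momentum=False.
scheduler = optim.lr_scheduler.CyclicLR(
    optimizer,
    base_lr=1e-4,
    max_lr=1e-3,
    cycle_momentum=False,
)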

I’ve run into this as well. It looks like a bug that will be fixed in the near future; see this issue. I’m currently trying to figure out whether it can be worked around by monkey-patching the Adam class somehow; a rough sketch of what I have in mind is below.
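
For what it’s worth, the stopgap I’m experimenting with isn’t really a monkey patch of Adam itself; it just gives each param group a dummy 'momentum' entry so the lookup in CyclicLR.__init__ stops raising. This is only a sketch and I haven’t verified it beyond a quick test:

from torch import nn, optim

model = nn.Linear(10, 2)  # placeholder model
optimizer = optim.Adam(model.parameters(), lr=1e-4)

# Give each param group a dummy 'momentum' entry so the lookup in
# CyclicLR.__init__ no longer raises. Adam itself never reads this key.
for group in optimizer.param_groups:
    group.setdefault('momentum', 0.0)

# Keep cycle_momentum=False so the dummy value is never actually cycled.
scheduler = optim.lr_scheduler.CyclicLR(
    optimizer,
    base_lr=1e-4,
    max_lr=1e-3,
    cycle_momentum=False,
)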

Any updates on this thread? I have the same problem using CyclicLR with Adam.

You need to pass cycle_momentum=False when you initialize the CyclicLR instance.
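
For example, something along these lines should work on a PyTorch version that includes the fix (values are illustrative):

import torch
from torch import nn, optim

model = nn.Linear(10, 2)  # placeholder model
optimizer = optim.Adam(model.parameters(), lr=1e-4)

# With the fix in place, cycle_momentum=False skips the momentum handling
# entirely, so CyclicLR works with Adam.
scheduler = optim.lr_scheduler.CyclicLR(
    optimizer,
    base_lr=1e-4,
    max_lr=1e-3,
    step_size_up=2000,
    cycle_momentum=False,
)

for _ in range(10):  # stand-in for a real training loop
    optimizer.zero_grad()
    loss = model(torch.randn(4, 10)).sum()
    loss.backward()
    optimizer.step()
    scheduler.step()  # advance the cyclical learning rate once per batch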