I want to apply torch.optim.lr_scheduler.CosineAnnealingWarmRestarts() to my custom optimizer. However, I get the error TypeError: XXX is not an Optimizer (where XXX is my custom optimizer). I created my optimizer by modifying torch.optim.SGD, and the optimizer itself works well. How can I solve this problem?
pytorch/lr_scheduler.py at master · pytorch/pytorch · GitHub shows that this TypeError is raised when your customized optimizer is not an instance of the Optimizer base class.
Does your customized optimizer subclass the base Optimizer class?
Yes. My customized optimizer is an instance of Optimizer. I directly modified the torch.optim.SGD code to create it.
You should subclass the base Optimizer class imported from torch.optim.optimizer:

from torch.optim.optimizer import Optimizer
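To make this concrete, here is a minimal sketch of a custom SGD-style optimizer that subclasses torch.optim.optimizer.Optimizer, so the scheduler's isinstance check passes. The class name MySGD and the simplified update rule are illustrative, not the original poster's actual code:

```python
import torch
from torch.optim.optimizer import Optimizer


class MySGD(Optimizer):
    """Minimal SGD-style optimizer; subclassing Optimizer is what
    lets CosineAnnealingWarmRestarts accept it."""

    def __init__(self, params, lr=0.1):
        if lr <= 0.0:
            raise ValueError(f"Invalid learning rate: {lr}")
        defaults = dict(lr=lr)
        super().__init__(params, defaults)

    @torch.no_grad()
    def step(self, closure=None):
        loss = None
        if closure is not None:
            with torch.enable_grad():
                loss = closure()
        for group in self.param_groups:
            for p in group["params"]:
                if p.grad is not None:
                    # Plain gradient descent step: p <- p - lr * grad
                    p.add_(p.grad, alpha=-group["lr"])
        return loss


model = torch.nn.Linear(4, 2)
opt = MySGD(model.parameters(), lr=0.1)
# No TypeError here, because MySGD is an instance of Optimizer:
sched = torch.optim.lr_scheduler.CosineAnnealingWarmRestarts(opt, T_0=10)
```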
Thanks! That works! I found that I was importing the old version of the Optimizer class: from Grad_optimizer import Optimizer.
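For anyone hitting the same trap: two classes can share the name Optimizer and still be different classes, because isinstance compares class objects, not names. A self-contained sketch (the class names below are stand-ins for torch.optim's Optimizer and the old Grad_optimizer one, not real imports):

```python
class TorchOptimizer:
    """Stand-in for torch.optim.optimizer.Optimizer."""


class LegacyOptimizer:
    """Stand-in for the old, same-named Grad_optimizer.Optimizer."""


class MyOptimizer(LegacyOptimizer):
    """Subclasses the wrong base, as in the original bug."""


opt = MyOptimizer()

# Passes for the base it actually inherits from...
print(isinstance(opt, LegacyOptimizer))  # True
# ...but fails for the same-named class from the other module,
# which is exactly what triggers the scheduler's TypeError.
print(isinstance(opt, TorchOptimizer))   # False
```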