Hi, I want to adjust the learning rate of one part of my model, let’s call it PartA, using lr_schedulerA,
and another part, PartB, using lr_schedulerB.
I didn’t find a way to do this; the only solution I found is to duplicate my optimizer and put the parameters of each part in the corresponding optimizer:
optimizerA = torch.optim.SGD(parametersA, args.lr,
                             momentum=args.sgd_momentum,
                             weight_decay=args.weight_decay)
optimizerB = torch.optim.SGD(parametersB, args.lr,
                             momentum=args.sgd_momentum,
                             weight_decay=args.weight_decay)
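(Here parametersA and parametersB are just the parameters of the two parts; how you collect them depends on your model, but something like this, where partA and partB are placeholder submodule names:)
# placeholder names, assuming the model exposes the two parts as submodules
parametersA = model.partA.parameters()
parametersB = model.partB.parameters()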
And then:
lr_schedulerA = torch.optim.lr_scheduler.MultiStepLR(optimizerA,
                                                     milestones=[100, 150],
                                                     last_epoch=args.start_epoch - 1)
lr_schedulerB = torch.optim.lr_scheduler.ExponentialLR(optimizerB, gamma=0.99)
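In the training loop this means stepping both optimizers and both schedulers, roughly like this (a sketch of what I have in mind; args.epochs, train_loader, and criterion are just placeholders from my setup):
for epoch in range(args.start_epoch, args.epochs):
    for inputs, targets in train_loader:
        # zero gradients held by both optimizers
        optimizerA.zero_grad()
        optimizerB.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        # update each part with its own optimizer
        optimizerA.step()
        optimizerB.step()
    # advance both schedulers once per epoch
    lr_schedulerA.step()
    lr_schedulerB.step()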
Does anyone have a better idea to share with me?
Thanks!