Any Easier Way for Optimizer Initialization for One Layer?

Hi, I am constructing an optimizer for a ResNet. As http://pytorch.org/docs/master/optim.html mentions, we can either set default optimizer parameters for all layers or pass every layer's parameters one by one.

The problem is that I'm training a ResNet and want to modify the learning rate of only ONE particular layer. What trick can I use to configure that single layer specifically while leaving all the others at the defaults?

Thanks!

You can use the per-parameter options: http://pytorch.org/docs/master/optim.html#per-parameter-options
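The idea is to pass the optimizer a list of parameter groups instead of a flat parameter list: any option (like `lr`) set inside a group overrides the keyword default for just that group. A minimal sketch, using a small stand-in model instead of a real ResNet (with torchvision you would use `models.resnet18()` and its final layer `model.fc`):

```python
import torch.nn as nn
import torch.optim as optim

# Minimal stand-in for a ResNet; with torchvision you would use
# models.resnet18(), whose final layer is also named "fc".
model = nn.Sequential()
model.add_module("conv", nn.Conv2d(3, 8, 3))
model.add_module("fc", nn.Linear(8, 2))

# Split parameters: everything except the "fc" layer keeps the default lr.
base_params = [p for name, p in model.named_parameters() if not name.startswith("fc")]

optimizer = optim.SGD(
    [
        {"params": base_params},                        # falls back to lr=1e-2 below
        {"params": model.fc.parameters(), "lr": 1e-3},  # this ONE layer gets 1e-3
    ],
    lr=1e-2,       # default for any group without its own "lr"
    momentum=0.9,  # shared by all groups
)

print([g["lr"] for g in optimizer.param_groups])  # [0.01, 0.001]
```

Note that every parameter must appear in exactly one group, so you filter the special layer's parameters out of the base group (here via `named_parameters()` and the layer's name prefix).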