Is it possible to give a lower learning rate to specific weights of a specific layer?

Is it possible to give a lower learning rate to specific weights of a specific layer?
For example, if the weight index list is [1, 2, 3, 4, 5], give those weights of convolution layer 5 a learning rate of 0.1, and give the other weights of convolution layer 5 a learning rate of 0.01.

Is it possible?

I think you can do it by defining two optimizers and passing the desired parameters to each one with a different learning rate:

params1 = ...  # parameters that should use lr1
params2 = ...  # parameters that should use lr2
opt1 = torch.optim.Adam(params1, lr=lr1)
opt2 = torch.optim.Adam(params2, lr=lr2)
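
As a concrete sketch (the model and layer indices below are made up for illustration), you can give one layer's weight tensor its own optimizer and put the remaining parameters into a second one. Note that this works at the granularity of whole parameter tensors, not individual weight indices inside a tensor:

import torch
import torch.nn as nn

# Hypothetical model: give the second conv layer's weight lr=0.1
# and every other parameter lr=0.01.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3),
    nn.ReLU(),
    nn.Conv2d(16, 32, 3),
)

special_params = [model[2].weight]
other_params = [p for p in model.parameters() if p is not model[2].weight]

opt1 = torch.optim.Adam(special_params, lr=0.1)
opt2 = torch.optim.Adam(other_params, lr=0.01)

# In the training loop, call backward once and then step both optimizers:
# loss.backward(); opt1.step(); opt2.step(); opt1.zero_grad(); opt2.zero_grad()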

Or you can also do it with per-parameter options, as shown in the PyTorch tutorials:

optim.SGD([
    {'params': model.base.parameters()},
    {'params': model.classifier.parameters(), 'lr': 1e-3}
], lr=1e-2, momentum=0.9)

This means that model.base's parameters will use the default learning rate of 1e-2, model.classifier's parameters will use a learning rate of 1e-3, and a momentum of 0.9 will be used for all parameters.
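
Applied to the question above, a hedged sketch (the model and layer indices are hypothetical) would put the chosen conv layer's weight tensor in its own parameter group with a learning rate of 0.1 and let everything else fall back to the default of 0.01:

import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, 3),
    nn.ReLU(),
    nn.Conv2d(16, 32, 3),
)

target = model[2].weight                                   # the weight tensor that gets its own lr
rest = [p for p in model.parameters() if p is not target]  # everything else

optimizer = torch.optim.SGD([
    {'params': [target], 'lr': 0.1},
    {'params': rest},  # uses the default lr below
], lr=0.01, momentum=0.9)

for group in optimizer.param_groups:
    print(group['lr'])  # 0.1 for the target group, 0.01 for the rest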


Thanks! I’ll give it a try :clap: