Different learning rates within a layer

Hi there, may I ask: if I want to use different learning rates for different neurons within a layer, is there a code example showing how to implement this? Thank you.

I'm not sure whether this is what you want, but if your module has named parameters, you can do something like:

import torch.nn as nn
import torch.optim as optim

L = nn.Linear(5, 3)
optimizer = optim.SGD([
    {'params': L.weight},            # uses the default lr (1e-2)
    {'params': L.bias, 'lr': 1e-3},  # this group overrides the lr
], lr=1e-2, momentum=0.9)

You can read the torch.optim docs for more information.

Hi mMagmer, thank you for your reply. That's not quite my case.
What I want is to set different learning rates on individual weights within the same layer. In your example, {'params': L.weight} updates all of L's weights at the same lr. Is it possible to give different weights different learning rates? Thank you.

I don't know of a built-in way for that; maybe you can write your own custom optimizer.
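
For plain SGD (no momentum or weight decay), a minimal sketch of such a custom optimizer could let a parameter group carry a tensor of per-element learning rates. The names PerElementSGD, lr_tensor, and w_lr below are made up for illustration; the fallback path just reproduces vanilla SGD:

import torch
import torch.nn as nn

class PerElementSGD(torch.optim.Optimizer):
    """Sketch of plain SGD where a group may carry a tensor of
    per-element learning rates ('lr_tensor') instead of a scalar lr."""

    def __init__(self, params, lr=1e-2):
        defaults = dict(lr=lr, lr_tensor=None)
        super().__init__(params, defaults)

    @torch.no_grad()
    def step(self, closure=None):
        loss = closure() if closure is not None else None
        for group in self.param_groups:
            lr_tensor = group['lr_tensor']
            for p in group['params']:
                if p.grad is None:
                    continue
                if lr_tensor is not None:
                    p.add_(-lr_tensor * p.grad)         # element-wise step sizes
                else:
                    p.add_(p.grad, alpha=-group['lr'])  # ordinary scalar lr
        return loss

L = nn.Linear(5, 3)
# hypothetical per-element lr: the first output neuron's weights learn 10x slower
w_lr = torch.full_like(L.weight, 1e-2)
w_lr[0] = 1e-3
opt = PerElementSGD([
    {'params': [L.weight], 'lr_tensor': w_lr},
    {'params': [L.bias]},   # falls back to the scalar lr
], lr=1e-2)

An alternative that keeps the built-in optimizers is to register a hook on the parameter that rescales its gradient element-wise (e.g. L.weight.register_hook(lambda g: g * scale), with scale a tensor you define); for vanilla SGD this is equivalent to per-element learning rates, but it interacts less transparently with momentum and weight decay.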