Change learning rate of only one layer

Hello,

I have seen that it is possible to set different learning rates for different layers using Per-parameter options. But in my case, I have a lot of different modules, and I only want to set a different learning rate for a single layer.

I tried something like this:

self._optimizer = optim.SGD(
    [{'params': model.parameters()},
     {'params': model.fc2.parameters(), 'lr': 5e-3}],
    lr=1e-2)

But then I get the following error: ValueError: some parameters appear in more than one parameter group

So how could I build one parameter group with the parameters of that specific layer and another group with all the remaining parameters, without any parameter appearing in both groups?

Thanks.

You could try to filter out your special layer:

optim.SGD(
    [{'params': [param for name, param in model.named_parameters() if 'fc2' not in name]},
     {'params': model.fc2.parameters(), 'lr': 5e-3}],
    lr=1e-2)
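
For completeness, here is a minimal, self-contained sketch of the same idea (the model and layer names are just placeholders). It splits the parameters by object identity instead of by name, which avoids accidentally matching other parameters whose names also contain 'fc2':

import torch.nn as nn
import torch.optim as optim

# Toy model with hypothetical layer names, just for illustration.
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(10, 20)
        self.fc2 = nn.Linear(20, 2)

model = Net()

# Parameters of the special layer ...
fc2_params = list(model.fc2.parameters())
fc2_ids = {id(p) for p in fc2_params}

# ... and all remaining parameters, filtered by identity so nothing is duplicated.
base_params = [p for p in model.parameters() if id(p) not in fc2_ids]

optimizer = optim.SGD(
    [{'params': base_params},              # uses the default lr below
     {'params': fc2_params, 'lr': 5e-3}],  # special lr for fc2
    lr=1e-2)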

Thank you! That is just what I was looking for.

Is it possible to change the learning rate of a specific layer during training? I don’t see any “name to parameter” mapping in the optimizer.

You would have to get the parameters of this specific layer. Is my example in this thread not working for your use case?

Thanks for the reply. If I understand correctly, the example code is the initialization of an optimizer. What I mean is: when I already have an optimizer and, after several epochs, I want to change the learning rate of only a specific layer, let’s say layer “conv2.2”, is there a way to change it directly in the optimizer, given the layer’s name?


To do this for a specific layer after several epochs, you should be able to combine @ptrblck’s solution with torch.optim.lr_scheduler.LambdaLR. You can supply a list of lambdas, one for each parameter group in the original optimizer, and put the update you want to make in the corresponding lambda. More info and the first example below can be found in the docs: https://pytorch.org/docs/stable/optim.html

>>> # Assuming optimizer has two groups.
>>> from torch.optim.lr_scheduler import LambdaLR
>>> lambda1 = lambda epoch: epoch // 30
>>> lambda2 = lambda epoch: 0.95 ** epoch
>>> scheduler = LambdaLR(optimizer, lr_lambda=[lambda1, lambda2])
>>> for epoch in range(100):
>>>     train(...)
>>>     validate(...)
>>>     scheduler.step()
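
Applied to the question in this thread, a rough sketch (the model, the layer split, and the epoch threshold are just assumptions) could keep the base group’s learning rate fixed and lower only the special layer’s group after some epoch:

import torch.nn as nn
import torch.optim as optim
from torch.optim.lr_scheduler import LambdaLR

# Toy two-layer model; the second layer plays the role of the "specific layer".
model = nn.Sequential(nn.Linear(10, 20), nn.Linear(20, 2))

optimizer = optim.SGD(
    [{'params': model[0].parameters()},               # group 0: all other layers
     {'params': model[1].parameters(), 'lr': 5e-3}],  # group 1: the special layer
    lr=1e-2)

# One lambda per parameter group; LambdaLR multiplies each group's initial lr
# by the value its lambda returns for the current epoch.
keep_lr = lambda epoch: 1.0                          # group 0: unchanged
drop_lr = lambda epoch: 1.0 if epoch < 20 else 0.1   # group 1: scaled down after epoch 20

scheduler = LambdaLR(optimizer, lr_lambda=[keep_lr, drop_lr])

for epoch in range(100):
    # train(...); validate(...)
    optimizer.step()   # placeholder for the actual training step
    scheduler.step()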

To add to that, here is an example of a per-epoch lambda, with changes at epochs 10, 30, and 50:

# LambdaLR multiplies the initial LR by this factor:
# 1.0 until epoch 10, 0.5 until epoch 30, 0.25 until epoch 50, then 0.1.
lambda_epoch = lambda e: 1.0 if e < 10 else 0.5 if e < 30 else 0.25 if e < 50 else 0.1
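
If this schedule should only affect the special layer, you could (sticking with the two-group optimizer sketched above) pair it with an identity lambda for the other group:

from torch.optim.lr_scheduler import LambdaLR

# Group 0 (all other layers) keeps its lr; group 1 (the special layer) follows lambda_epoch.
scheduler = LambdaLR(optimizer, lr_lambda=[lambda epoch: 1.0, lambda_epoch])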