I have seen that it is possible to set different learning rates for different layers using per-parameter options. But in my case, I have a lot of different modules, and I only want to set a different learning rate for a single layer.
import torch.optim as optim

optimizer = optim.SGD([
    # all parameters except fc2 use the default lr
    {'params': [param for name, param in model.named_parameters() if 'fc2' not in name]},
    # fc2 gets its own learning rate
    {'params': model.fc2.parameters(), 'lr': 5e-3},
], lr=1e-2)
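For reference, you can inspect the resulting groups like this (group 0 holds everything except fc2, group 1 holds fc2 only; the exact parameter counts depend on your model):

for i, group in enumerate(optimizer.param_groups):
    print(i, group['lr'], len(group['params']))
# 0 0.01 ...   (default lr)
# 1 0.005 ...  (fc2 only)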
Thanks for the reply. If I understand correctly, the example code initializes an optimizer. But I mean: when I already have an optimizer, and after several epochs I want to change the learning rate of a specific layer only, say layer “conv2.2”, is there a way to change it directly in the optimizer, given the layer’s name?
To do this for a specific layer after several epochs, you should be able to combine @ptrblck’s solution with torch.optim.lr_scheduler.LambdaLR. You can supply a list of lambdas, one for each parameter group in the original optimizer, and put the update you want in the lambda for that layer’s group. More info and the example below can be found in the docs: https://pytorch.org/docs/stable/optim.html
>>> # Assuming optimizer has two groups.
>>> lambda1 = lambda epoch: epoch // 30
>>> lambda2 = lambda epoch: 0.95 ** epoch
>>> scheduler = LambdaLR(optimizer, lr_lambda=[lambda1, lambda2])
>>> for epoch in range(100):
>>>     train(...)
>>>     validate(...)
>>>     scheduler.step()
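Concretely, for the two-group optimizer above, a minimal sketch might look like this (the lambdas are matched to the groups in the order the groups were passed to SGD, and each lambda's return value multiplies that group's initial lr):

from torch.optim.lr_scheduler import LambdaLR

# One lambda per parameter group, in creation order.
keep_base = lambda epoch: 1.0             # group 0: leave the base lr untouched
decay_fc2 = lambda epoch: 0.95 ** epoch   # group 1 (fc2): exponential decay

scheduler = LambdaLR(optimizer, lr_lambda=[keep_base, decay_fc2])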
To add to that, here is an example of a per-epoch lambda, with changes at epochs 10, 30, and 50:
# Multiply the original LR by 0.5 from epoch 10, by 0.25 from epoch 30,
# and by 0.1 from epoch 50.
lambda_epoch = lambda e: 1.0 if e < 10 else (0.5 if e < 30 else (0.25 if e < 50 else 0.1))
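Plugged into the two-group optimizer above, leaving group 0 at its original rate and applying the schedule to fc2 only, it would look something like this:

scheduler = LambdaLR(optimizer, lr_lambda=[lambda e: 1.0, lambda_epoch])
# With fc2's initial lr of 5e-3, this gives:
#   epochs  0-9  -> 5e-3
#   epochs 10-29 -> 2.5e-3
#   epochs 30-49 -> 1.25e-3
#   epochs 50+   -> 5e-4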