Freezing more than one layer for transfer learning

Hi,
after going through a nice transfer learning tutorial, I'm trying to figure out the right way to fine-tune a ResNet18 so that not only the fully connected layer but also the (layer4) block gets trained, with everything else frozen. Should I freeze the whole net and then re-create these layers? If so, it's unclear how to configure the optimizer instead of:
optimizer_conv = optim.SGD(model_conv.fc.parameters(), lr=0.001, momentum=0.9)
I’d greatly appreciate any advice.

You can try something like this:

optimizer = optim.SGD([{'params': model.classifier.parameters()},
                       {'params': model.features.parameters(), 'lr': 0.0}],
                      lr=0.001, momentum=0.5)
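
Since you're working with ResNet18, which doesn't have the classifier/features split, the same idea would look roughly like this. This is just a sketch assuming torchvision's resnet18; the learning rate and momentum values are placeholders:

import torch.optim as optim
from torchvision import models

model = models.resnet18(pretrained=True)

# parameters that should actually be trained
tuned = list(model.layer4.parameters()) + list(model.fc.parameters())
tuned_ids = {id(p) for p in tuned}
# everything else stays at lr 0.0, i.e. effectively frozen
frozen = [p for p in model.parameters() if id(p) not in tuned_ids]

optimizer = optim.SGD([{'params': tuned},
                       {'params': frozen, 'lr': 0.0}],
                      lr=0.001, momentum=0.9)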

So I’ll try something like:
for param in model.parameters():
    param.requires_grad = False
for param in model.layer4.parameters():
    param.requires_grad = True
for param in model.fc.parameters():
    param.requires_grad = True

optimizer = torch.optim.SGD([{'params': model.layer4.parameters()},
                             {'params': model.fc.parameters()}],
                            lr=args.lr, momentum=args.momentum,
                            weight_decay=args.weight_decay)
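
Or, since the requires_grad flags are already set as above, I guess I could simply filter out the frozen parameters instead of listing the groups explicitly (same args as above, just a sketch):

# only parameters with requires_grad=True are handed to the optimizer
optimizer = torch.optim.SGD(filter(lambda p: p.requires_grad, model.parameters()),
                            lr=args.lr, momentum=args.momentum,
                            weight_decay=args.weight_decay)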

Thanks a lot!