Finer control for freezing layers in ResNet

Hi,

I would like to fine-tune ResNet-18 on another dataset, and I would like to run a study of the network's performance as a function of which layers are frozen.

As of now, I first set up the model as follows:

```
import torch.nn as nn
import torch.optim as optim
from torchvision import models

model_ft = models.resnet18(pretrained=True)
num_ftrs = model_ft.fc.in_features
model_ft.fc = nn.Linear(num_ftrs, 2)
```

To make all layers learnable:

```
optimizer_ft = optim.SGD(model_ft.parameters(), lr=0.001, momentum=0.9)
```

To make only the classification layer learnable:

```
for param in model_ft.parameters():
    param.requires_grad = False

# A freshly constructed layer has requires_grad=True by default,
# so replacing fc after freezing leaves only fc trainable
num_ftrs = model_ft.fc.in_features
model_ft.fc = nn.Linear(num_ftrs, 2)

optimizer_ft = optim.SGD(model_ft.fc.parameters(), lr=0.001, momentum=0.9)
```

To make the last block and the classification layer learnable:

```
lt = 8
cntr = 0

# Freeze the first lt-1 top-level children (conv1, bn1, relu, maxpool,
# layer1, layer2, layer3); layer4, avgpool, and fc stay trainable
for child in model_ft.children():
    cntr += 1
    if cntr < lt:
        for param in child.parameters():
            param.requires_grad = False

num_ftrs = model_ft.fc.in_features
model_ft.fc = nn.Linear(num_ftrs, 2)
optimizer_ft = optim.SGD(filter(lambda p: p.requires_grad, model_ft.parameters()),
                         lr=0.001, momentum=0.9)
```
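As a sanity check, I list the parameters that will actually be updated (this just uses the standard `named_parameters` API):

```
# Print the names of all parameters that remain trainable
for name, param in model_ft.named_parameters():
    if param.requires_grad:
        print(name)
```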

Now my query is: within a block there are a few convolution layers. How do I access them and freeze them individually?
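Ideally I am after something like the following sketch (my guess, assuming the torchvision ResNet-18 layout, where each stage such as `layer4` is a `Sequential` of `BasicBlock`s exposing `conv1`/`conv2` attributes):

```
# layer4 is the last stage; layer4[1] is its second (and last) BasicBlock
block = model_ft.layer4[1]
print(block)  # shows conv1, bn1, conv2, bn2, ...

# Freeze a single convolution inside the block
for param in block.conv1.parameters():
    param.requires_grad = False
```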

You may refer to:

By the way, try to format your code with markdown syntax:

```
code here
```

@chenyuntc Thank you. But this still does not answer my problem, as that solution makes the entire block either trainable or frozen. I would like to go inside the block and keep only the classification layer, the average-pooling layer, and the last convolution layer in the last block trainable.
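In other words, I am after something like this sketch (assuming the torchvision layout, where `layer4[1]` is the last `BasicBlock` and `conv2` is its last convolution; note the average-pooling layer has no parameters, so there is nothing to unfreeze there):

```
# Freeze everything first
for param in model_ft.parameters():
    param.requires_grad = False

# Unfreeze only the last convolution of the last block and the classifier
# (avgpool has no parameters, so it needs no special handling)
for param in model_ft.layer4[1].conv2.parameters():
    param.requires_grad = True
for param in model_ft.fc.parameters():
    param.requires_grad = True

optimizer_ft = optim.SGD(filter(lambda p: p.requires_grad, model_ft.parameters()),
                         lr=0.001, momentum=0.9)
```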

It may work for you. Please see this part of the page I sent in the following link (a GitHub page):

Freezing parameters of some layers to prevent them from retraining


Oh, this is exactly what I was looking for. Thank you!