@apaszke I am trying to fine-tune a resnet18. However, I would like to freeze all layers except for the classification layer and the convolution layer just before the average pooling.
Upon using the following code:
"""
model_ft = models.resnet18(pretrained=True)
lt=8
cntr=0
for child in model_ft.children():
cntr+=1
if cntr < lt:
# print child
for param in child.parameters():
param.requires_grad = False
num_ftrs = model_ft.fc.in_features
model_ft.fc = nn.Linear(num_ftrs,2)
optimizer_ft = optim.SGD(filter(lambda p: p.requires_grad, model_ft.parameters()), lr=0.001, momentum=0.9)
“”
With the code above, all the convolution units in the last block (layer4) become trainable. How do I make sure that only the last convolution unit within this block, the average pooling, and the classification layer are trainable while the rest stay frozen? Any suggestions?