Cannot freeze model - requires_grad is always true

I have tried to freeze part of my model, but it does not work: gradient computation is still enabled for every layer. Is this a bug, or am I doing something wrong? 🙂

import torch.nn as nn
from torchvision import models

model = models.resnet18(pretrained=True)

# To freeze the residual layers
for param in model.parameters():
    param.require_grad = False
for param in model.fc.parameters():
    param.require_grad = True
# Replace last layer        
num_features = model.fc.in_features
model.fc = nn.Linear(num_features, 2)
model.fc = nn.Dropout(0.5)
# Find total parameters and trainable parameters
total_params = sum(p.numel() for p in model.parameters())
print(f'{total_params:,} total parameters.')
total_trainable_params = sum(
    p.numel() for p in model.parameters() if p.requires_grad)
print(f'{total_trainable_params:,} training parameters.')

21,284,672 total parameters.
21,284,672 training parameters.

The line model.fc = nn.Dropout(0.5) doesn't make sense here: it replaces model.fc, which was just set to a linear layer with trainable parameters, with a layer that has no trainable parameters at all (nn.Dropout).

Declare the dropout as another layer (note that ResNet's forward won't call a new model.dropout attribute automatically, so you would also need to apply it yourself):

model.fc = nn.Linear(num_features, 2)
model.dropout = nn.Dropout(0.5)

Or combine the two like this:

model.fc = nn.Sequential(
    nn.Linear(num_features, 2),
    nn.Dropout(0.5)
)

Let me know if that changes anything.


You are creating a new .require_grad attribute on each parameter, while you most likely want to set the existing .requires_grad attribute to False (note the missing s in your code).
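
Putting both fixes together, a minimal sketch based on your snippet (using the Sequential head suggested above) would look like this; with the typo corrected, only the new fc head should be reported as trainable, i.e. 1,026 parameters for a 2-class head on resnet18:

import torch.nn as nn
from torchvision import models

model = models.resnet18(pretrained=True)

# Freeze the pretrained backbone (requires_grad, with the "s")
for param in model.parameters():
    param.requires_grad = False

# Replace the classifier head; parameters of newly created modules
# default to requires_grad=True, so the new head stays trainable
num_features = model.fc.in_features
model.fc = nn.Sequential(
    nn.Linear(num_features, 2),
    nn.Dropout(0.5)
)

total_params = sum(p.numel() for p in model.parameters())
trainable_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f'{total_params:,} total parameters.')
print(f'{trainable_params:,} training parameters.')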