How to confirm parameters of frozen part of network are not being updated?

I am new to PyTorch. I set requires_grad for the feature extraction layers of vgg16 to False (as I want to freeze these layers for fine-tuning the model) using the following code:

for name, param in model.named_parameters():
    if param.requires_grad and 'features' in name:
        param.requires_grad = False
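To confirm the flags were actually changed, I also print them afterwards (just a quick sanity check; in the standard torchvision vgg16 the convolutional layers live under model.features):

for name, param in model.named_parameters():
    print(name, param.requires_grad)  # features.* should print False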

Is this the correct way of freezing some of the network's layers? Also, is there a way to confirm that the weights and biases in the frozen part of the network are not being updated?

You could check the .grad attribute of the frozen parameters after the backward() call and make sure it's set to None. Additionally, you could print the parameters before and after the optimizer.step() call to make sure they are not updated.


Thanks for your response. I have no idea how to check .grad; could you please guide me?

You can directly access this attribute:

print(model.layer.weight.grad) # should print None
loss.backward()
print(model.layer.weight.grad) # should still print None if the layer is frozen
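For the second check, you could clone a frozen parameter before the optimizer.step() call and compare it afterwards. A rough sketch (model.features[0].weight assumes the torchvision vgg16 layout; optimizer and loss come from your own training loop):

# copy a frozen parameter before the update
before = model.features[0].weight.clone()

loss.backward()
optimizer.step()

# the frozen parameter should be unchanged after the step
print(torch.equal(before, model.features[0].weight))  # should print True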

Thank you again. I checked that the .grad attribute of the frozen layers is None both before and after loss.backward(), which means the parameters of the frozen layers will not be updated during fine-tuning.