How do you freeze some layers of a network in PyTorch and train only the rest?

If you are using a pretrained model, say ResNet-50:

import torchvision.models as models
model = models.resnet50(pretrained=True)
To access the different layers:

model.layer1
model.layer2
model.layer3
model.layer4

Setting requires_grad to False for layer1:

for param in model.layer1.parameters():
    param.requires_grad = False

Confirming whether it is frozen or not:

for param in model.layer1.parameters():
    print(param.requires_grad)

Hope this helps.
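Putting the steps above together, here is a minimal runnable sketch. It uses a small toy model in place of resnet50 (so it runs without downloading weights); the layer1/layer2 names just mirror the attribute access shown above.

```python
import torch.nn as nn

# Toy stand-in for resnet50; layer1/layer2 are placeholder names
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.layer1 = nn.Linear(4, 4)
        self.layer2 = nn.Linear(4, 2)

    def forward(self, x):
        return self.layer2(self.layer1(x))

model = TinyNet()

# Freeze layer1: its parameters stop receiving gradient updates
for param in model.layer1.parameters():
    param.requires_grad = False

# Confirm which parameters are frozen
for name, param in model.named_parameters():
    print(name, param.requires_grad)
```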

Do you mean to just add A.parameters() to the optimizer when it is declared?
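A common pattern here (a sketch with a toy model, not code from the thread; A in the question would be one sub-module) is to pass only the still-trainable parameters when constructing the optimizer:

```python
import torch
import torch.nn as nn

# Toy two-layer model; in practice this would be your real network
model = nn.Sequential(nn.Linear(4, 4), nn.Linear(4, 2))

# Freeze the first sub-module
for param in model[0].parameters():
    param.requires_grad = False

# Hand the optimizer only the parameters that still require gradients
optimizer = torch.optim.SGD(
    [p for p in model.parameters() if p.requires_grad], lr=0.01
)
print(len(optimizer.param_groups[0]["params"]))  # weight + bias of the second layer
```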


frozen_layers = [model.module.down.ops.0.conv_block.adn.N.weight]
for layer in frozen_layers:
    for name, value in layer.named_parameters():
        value.requires_grad = False

  File "<ipython-input-31-a9e45f05e3b6>", line 2
    frozen_layers = [model.module.down.ops.0.conv_block.adn.N.weight]
                                           ^                                                          
SyntaxError: invalid syntax

I am facing an issue freezing the layer named ‘module.down.ops.0.conv_block.adn.N.weight’, I think because of the .0. in it. Please let me know how to freeze such layers.

Thanks in advance!

Not sure about this specific attribute, but if its name is actually 0, you can use getattr to access it:

getattr(model.module.down.ops, '0').conv_block.adn.N.weight
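To illustrate with a toy container (the poster's ops is presumably an nn.Sequential or nn.ModuleList, whose children are named '0', '1', …):

```python
import torch.nn as nn

# Toy container whose children are named '0', '1', ... like nn.Sequential
ops = nn.Sequential(nn.Linear(4, 4), nn.Linear(4, 2))

# '0' is not a valid attribute name in Python source, but both of these work:
first = getattr(ops, '0')  # dynamic attribute lookup by string name
assert first is ops[0]     # Sequential/ModuleList also support indexing

# Freeze just that sub-module
for param in ops[0].parameters():
    param.requires_grad = False
```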

Can’t you just do:

model.sub_module.eval()

?
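One caveat worth noting: eval() and freezing are different things. eval() only switches layers like Dropout and BatchNorm to inference behavior; it does not stop gradients. A small sketch:

```python
import torch.nn as nn

block = nn.Sequential(nn.Linear(4, 4), nn.Dropout(p=0.5))

block.eval()  # switches Dropout/BatchNorm to inference mode...
# ...but parameters still require gradients, so eval() alone does not freeze
print(all(p.requires_grad for p in block.parameters()))  # True

for p in block.parameters():  # freezing is a separate, explicit step
    p.requires_grad = False
```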

It turns out model.named_children() is not needed here, since model.parameters() already recurses into all sub-modules. So a simpler method is:

def dfs_freeze(model):
    for param in model.parameters():
        param.requires_grad = False
    return model
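For example, calling it on just one (hypothetical) nested block freezes everything inside that block, with no manual DFS needed:

```python
import torch.nn as nn

def dfs_freeze(model):  # as defined above
    for param in model.parameters():
        param.requires_grad = False
    return model

# Toy model with a nested block; freeze only the nested part
net = nn.Sequential(nn.Linear(4, 4),
                    nn.Sequential(nn.Linear(4, 4), nn.Linear(4, 2)))
dfs_freeze(net[1])  # parameters() recurses, so the whole block is frozen

print([p.requires_grad for p in net[0].parameters()])  # [True, True]
print(any(p.requires_grad for p in net[1].parameters()))  # False
```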

Hi.
I am wondering about this: there are 50 layers in ResNet-50, but only about 10 top-level modules show up when you freeze it like this. How does that work?
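For what it's worth, the "50" in ResNet-50 counts weight layers (convolutions plus the final fc), not top-level sub-modules, so freezing model.layer1 freezes every block inside that stage. The arithmetic, based on the standard ResNet-50 structure:

```python
# layer1..layer4 hold 3, 4, 6 and 3 Bottleneck blocks respectively,
# and each Bottleneck has 3 convolutions (1x1, 3x3, 1x1)
blocks_per_stage = [3, 4, 6, 3]
stage_convs = sum(b * 3 for b in blocks_per_stage)  # 48 convolutions
total = 1 + stage_convs + 1  # initial 7x7 conv + stage convs + final fc
print(total)  # 50
```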