wonchulSon
(Wonchul Son)
September 11, 2019, 6:40am
1
I want to freeze several CNN layers.
How can I do that?
1: Do not include the parameters of the CNN layers in the optimizer.
2: Set the grad to 0 before updating the model parameters, or set requires_grad=False on the CNN parameters before training the model.
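A minimal sketch of both options, using a made-up tiny model (one conv layer to freeze, one linear head to train); the layer sizes here are illustrative assumptions, not from the thread:

```python
import torch
import torch.nn as nn
import torch.optim as optim

# hypothetical tiny model: a conv feature extractor plus a linear head
model = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1),  # CNN layer we want to freeze
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(8 * 4 * 4, 2),        # head we keep training
)

# option 2: turn off gradient tracking for the conv layer's parameters
for p in model[0].parameters():
    p.requires_grad = False

# option 1: hand the optimizer only the parameters that still require grad
optimizer = optim.SGD(
    (p for p in model.parameters() if p.requires_grad), lr=0.01
)

x = torch.randn(1, 3, 4, 4)
loss = model(x).sum()
loss.backward()
optimizer.step()

# the frozen conv layer received no gradient and was not updated
print(model[0].weight.grad)  # None
```

Combining both is the safest route: requires_grad=False stops gradient computation, and filtering the optimizer guarantees the frozen weights are never touched by updates such as weight decay or momentum.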
wonchulSon
(Wonchul Son)
September 11, 2019, 8:34am
3
@DoubtWang
Thank you for the help.
But can you show me some example code…?
I’m a beginner^^
spanev
(Serge Panev)
September 11, 2019, 11:40am
4
Hi @wonchulSon ,
Please find two threads about it here:
Setting .requires_grad = False should work for convolution and FC layers. But how about networks that have instanceNormalization? Is setting .requires_grad = False enough for normalization layers too?
I would like to do it the following way -
# we want to freeze the fc2 layer this time: only train fc1 and fc3
net.fc2.weight.requires_grad = False
net.fc2.bias.requires_grad = False
# passing only those parameters that explicitly requires grad
optimizer = optim.Adam(filter(lambda p: p.requires_grad, net.parameters()), lr=0.1)
# then do the normal execution of loss calculation and backward propagation
# unfreezing the fc2 layer for extra tuning if needed
net.fc2.weight.requires_grad = True
n…
The second one is doing exactly what @DoubtWang is suggesting above.
Let us know if you need more details about it!
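One caveat with the quoted snippet: if the optimizer was built with the filter over requires_grad, it never registered fc2's parameters, so flipping requires_grad back to True is not enough by itself. A hedged sketch of the fix (the three-layer Net here is an assumed stand-in for the net in the quote), using add_param_group:

```python
import torch
import torch.nn as nn
import torch.optim as optim

# assumed stand-in for the net from the quoted thread
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(4, 4)
        self.fc2 = nn.Linear(4, 4)
        self.fc3 = nn.Linear(4, 2)

    def forward(self, x):
        return self.fc3(self.fc2(self.fc1(x)))

net = Net()

# freeze fc2, then build the optimizer over the remaining parameters
net.fc2.weight.requires_grad = False
net.fc2.bias.requires_grad = False
optimizer = optim.Adam(
    filter(lambda p: p.requires_grad, net.parameters()), lr=0.1
)

# ... normal training loop ...

# unfreeze fc2 later: re-enable grads AND register the parameters
# with the optimizer, which otherwise does not know about them
net.fc2.weight.requires_grad = True
net.fc2.bias.requires_grad = True
optimizer.add_param_group({"params": [net.fc2.weight, net.fc2.bias]})
```

Without the add_param_group call, fc2 would accumulate gradients after unfreezing but never actually be updated by optimizer.step().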
Hi @wonchulSon ,
I believe you can find some examples and code in the threads given by @spanev .