How do you freeze some layers of a network in PyTorch and train only the rest?

This snippet may clarify how to do it.

Set requires_grad to False on the parameters of the layers you want to freeze:

# we want to freeze the fc2 layer
net.fc2.weight.requires_grad = False
net.fc2.bias.requires_grad = False
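
For context, here is a minimal sketch; the two-layer net below is a hypothetical stand-in for your own model, and freezing a whole layer just means setting requires_grad = False on each of its parameters:

import torch
import torch.nn as nn

# hypothetical stand-in model with two fully connected layers
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(10, 10)
        self.fc2 = nn.Linear(10, 2)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))

net = Net()

# freeze every parameter of fc2 in one loop
for param in net.fc2.parameters():
    param.requires_grad = False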

Then set up the optimizer so that it only receives the parameters that still require gradients:

import torch.optim as optim

optimizer = optim.SGD(filter(lambda p: p.requires_grad, net.parameters()), lr=0.1)
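
As a quick sanity check (continuing from the hypothetical net and the optimizer above), you can verify that a training step leaves the frozen layer untouched:

before = net.fc2.weight.clone()             # snapshot of the frozen weights
out = net(torch.randn(4, 10))               # dummy forward pass on a random batch
out.sum().backward()                        # gradients flow to fc1 only
optimizer.step()
assert torch.equal(before, net.fc2.weight)  # fc2 did not move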

Alternatively, you can add only the parameters you want to train to the optimizer:
https://discuss.pytorch.org/t/to-require-grad-or-to-not-require-grad/5726
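
For example (again assuming the hypothetical net above), that alternative amounts to passing only the trainable layer's parameters; the frozen layer is then excluded from updates even without touching requires_grad, although setting it to False still saves the gradient computation:

import torch.optim as optim

# only fc1's parameters are known to this optimizer, so fc2 is never updated
optimizer = optim.SGD(net.fc1.parameters(), lr=0.1)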

But I think the requires_grad approach is more straightforward.
