How do you freeze some layers of a PyTorch network and train only the rest?

Each parameter of the model has a requires_grad flag:
http://pytorch.org/docs/master/notes/autograd.html

For the ResNet example in the docs, this loop will freeze all layers:

for param in model.parameters():
    param.requires_grad = False

To partially unfreeze some of the last layers, identify the parameters you want to train inside such a loop and set their flag back to True; that alone will suffice.
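A minimal sketch of the whole pattern, using a small stand-in model instead of the full ResNet (for ResNet the last layer would be model.fc instead of model[-1]); passing only the trainable parameters to the optimizer is optional but avoids warnings on older versions:

```python
import torch
import torch.nn as nn

# Small stand-in model; the same pattern applies to torchvision's ResNet
model = nn.Sequential(
    nn.Linear(8, 16),
    nn.ReLU(),
    nn.Linear(16, 2),  # pretend this is the final classifier layer
)

# Freeze every parameter
for param in model.parameters():
    param.requires_grad = False

# Unfreeze only the last layer
for param in model[-1].parameters():
    param.requires_grad = True

# Give the optimizer only the parameters that still require gradients
optimizer = torch.optim.SGD(
    [p for p in model.parameters() if p.requires_grad], lr=0.01
)

trainable = [name for name, p in model.named_parameters() if p.requires_grad]
print(trainable)  # → ['2.weight', '2.bias']
```

During backprop, no gradients are computed for the frozen layers, so only the last layer's weights get updated.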
