If requires_grad is set to False, that part of the model is frozen: no gradient updates are applied to its parameters during training. In the example below, every layer's parameters are updated during training because requires_grad is set to True.
import torch, torchvision
import torch.nn as nn

model = torchvision.models.resnet18(pretrained=True)
for param in model.parameters():
    param.requires_grad = True
for name, param in model.named_parameters():
    print('Name:', name, 'Requires_Grad:', param.requires_grad)
Try changing True in the loop above to False: every parameter becomes frozen, so the whole model is frozen except for any layer you replace afterwards, such as the last 'fc' layer in a transfer-learning setup, whose newly created parameters default to requires_grad=True.