Using pretrained weights in a custom model

I am currently working on an implementation of the paper Fully Convolutional Networks for Semantic Segmentation, starting with the FCN-32 architecture. I am using a pretrained VGG16 model and training on the PASCAL VOC 2012 dataset. I have a doubt about how I should use the pretrained VGG16 weights in the features section of my custom class implementation. I have created a code snippet that demonstrates this scenario. Can anyone tell me whether this way of using pretrained weights in a custom model is correct?

# Loading the pretrained VGG16 model

import torch
import torch.nn as nn
from torchvision import models

model1 = models.vgg16(pretrained=True)


# Freezing all layers except the fc layers

for param in model1.features.parameters():
    param.requires_grad = False


# Creating the FCN custom module:

class FCN(nn.Module):

    def __init__(self, num_classes=21):  # 21 = 20 PASCAL VOC classes + background
        super().__init__()
        self.features = nn.Sequential(*list(model1.features.children()))
        # fc6/fc7 converted to convolutions plus the scoring layer, as in FCN-32s
        self.classifier = nn.Sequential(
            nn.Conv2d(512, 4096, 7),
            nn.ReLU(inplace=True),
            nn.Dropout2d(),
            nn.Conv2d(4096, 4096, 1),
            nn.ReLU(inplace=True),
            nn.Dropout2d(),
            nn.Conv2d(4096, num_classes, 1),
        )

    def forward(self,x):

        x = self.features(x)

        x = self.classifier(x)

        return x


model2 = FCN()


# Again freezing the feature layers of model2

for params in model2.features.parameters():

    params.requires_grad = False


# To confirm that all the pretrained weights from VGG16 were transferred to the custom FCN model; must print True

print(list(model2.features.parameters()) == list(model1.features.parameters()))