How to initialize weights in custom layers while keeping pretrained layers untouched

Hi guys. Consider the following code:

import torch.nn as nn
import torchvision

class Net(nn.Module):
    def __init__(self):
        super().__init__()

        # Pretrained GoogLeNet backbone: keep the first 9 child modules
        # as a feature extractor with their pretrained weights.
        googlenet = torchvision.models.googlenet(pretrained=True)
        self.features = nn.Sequential(*list(googlenet.children())[0:9])

        # New layers that should be initialized from a normal distribution.
        self.conv1 = nn.Conv2d(512, 256, 3, 1, padding=1)
        self.conv2 = nn.Conv2d(256, 256, 3, 1, padding=1)
        self.conv3 = nn.Conv2d(256, 256, 3, 1, padding=1)
        self.conv4 = nn.Conv2d(256, 256, 3, 1, padding=1)
        self.conv5 = nn.Conv2d(256, 128, 3, 1, padding=1)

        self.conv6 = nn.Conv2d(128, 512, 1, 1)
        self.conv7 = nn.Conv2d(512, 100, 1, 1)

I want to know how to initialize the self.conv layers from a normal distribution while keeping the pretrained (self.features) layers untouched. Should I do it manually?
Thanks.

To initialize the new parameters you can use the torch.nn.init methods on them, and you can freeze the self.features module by setting the .requires_grad attribute of its parameters to False.
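Here is a minimal sketch of both steps, assuming the Net class from the question; the mean/std values and the SGD optimizer are illustrative choices, not requirements:

import torch
import torch.nn as nn

model = Net()

# 1) Initialize the new conv layers from a normal distribution
#    (mean=0.0, std=0.01 are example values).
for layer in [model.conv1, model.conv2, model.conv3, model.conv4,
              model.conv5, model.conv6, model.conv7]:
    nn.init.normal_(layer.weight, mean=0.0, std=0.01)
    nn.init.zeros_(layer.bias)

# 2) Freeze the pretrained backbone so its parameters are not updated.
for param in model.features.parameters():
    param.requires_grad = False

# Optionally pass only the trainable parameters to the optimizer.
optimizer = torch.optim.SGD(
    filter(lambda p: p.requires_grad, model.parameters()), lr=1e-3)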
