How to globally prune a deep neural network

I am trying to prune my model.

import torch
import torch.nn as nn
from torchvision import models


class EncoderCNN(nn.Module):

    def __init__(self):
        super(EncoderCNN, self).__init__()
        cnn = models.vgg19(pretrained=False)
        # drop the adaptive pool and classifier head, keep only the conv features
        modules = list(cnn.children())[:-2]
        self.cnn = nn.Sequential(*modules)
        self.enc_dim = list(cnn.features.children())[-3].weight.shape[0]
        self.avg_func = torch.nn.AvgPool2d(kernel_size=7, stride=1, padding=0)
        # self.tagClassifier = cnnModel
        # self.classifier_dim = list(cnnModel.features.children())[-1].weight.shape[0]

    def forward(self, x):
        x = self.cnn(x)
        avg_features = self.avg_func(x).squeeze()
        x = x.permute(0, 2, 3, 1)
        # y = self.tagClassifier(x)
        return x, avg_features

This is the code snippet of the model. For pruning, I am trying to specify the parameters like:
parameters_to_prune = (
    (encoderCNN.cnn, '0.0.weight'),
    (encoderCNN.cnn, '0.2.weight'),
)

This gives an obvious error: AttributeError: 'Sequential' object has no attribute '0.0.weight'. I understand the error, but I am not sure how to access these parameters and would appreciate help with that.
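
For context, walking the module tree shows where the weight tensors actually live; this is only a quick sketch, using the encoderCNN instance of the model above, and it prints each submodule together with the parameters it owns directly:

for module_name, module in encoderCNN.cnn.named_modules():
    # prune expects (module, parameter_name) pairs where the parameter is a
    # direct attribute of that module (e.g. 'weight'), not a dotted path
    for param_name, _ in module.named_parameters(recurse=False):
        print(module_name, param_name)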


I found the solution by writing it as:

parameters_to_prune = (
    (encoderCNN.cnn[0][0], 'weight'),
    (encoderCNN.cnn[0][2], 'weight'),
)
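
With these (module, name) pairs, the global pruning itself can then be done in a single call. A minimal sketch using torch.nn.utils.prune with an L1 criterion (the 0.2 amount is just an illustrative value):

import torch.nn.utils.prune as prune

prune.global_unstructured(
    parameters_to_prune,
    pruning_method=prune.L1Unstructured,
    amount=0.2,  # prune 20% of the weights across both layers combined
)

# optionally make the pruning permanent by removing the re-parametrization
for module, name in parameters_to_prune:
    prune.remove(module, name)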
