A way to do this is to create a new model that only uses DenseNet’s features block:
import torch.nn as nn
from torchvision.models import densenet121

model = densenet121(pretrained=True)

class FeatureExtractor(nn.Module):
    def __init__(self):
        super(FeatureExtractor, self).__init__()
        # Reuse the pre-trained convolutional feature block unchanged
        self.features = nn.Sequential(*list(model.features.children()))

    def forward(self, x):
        x = self.features(x)
        return x

model_features = FeatureExtractor()
Now model_features has the same architecture and parameters as the original pre-trained DenseNet, except that the fully-connected classifier on top has been removed; its output is the final convolutional feature map rather than class scores.