Breaking up densenet121 results in a size mismatch?

Hi,

I am trying to create a custom nn.Module that wraps a pre-trained densenet121, so that I can access the activations at the end of the convolutional part of the network.

As a starting point, I have just done the following:

import torch
from torchvision import models
import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        super(MyModel, self).__init__()

        self.model = models.densenet121(pretrained=True)

        self.features = self.model.features
        self.classifier = self.model.classifier

    def forward(self, x):
        x = self.features(x)
        x = self.classifier(x)
        return x

example = torch.ones((1,3,224,224))

model = MyModel()
output = model.forward(example)

However, I get a size mismatch error on the line x = self.classifier(x) in the forward method:

RuntimeError: size mismatch, m1: [7168 x 7], m2: [1024 x 1000] at ../aten/src/TH/generic/THTensorMath.cpp:961

Also, as far as I can tell, the classifier part of densenet121 expects inputs of shape (batch_size, 1024), but when I print the size of x after x = self.features(x) in the forward method, it is (1, 1024, 7, 7)?
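
For reference, this is roughly how I checked those shapes (the printed values are from my run, so treat them as illustrative):

import torch
from torchvision import models

m = models.densenet121(pretrained=True)
print(m.classifier)   # Linear(in_features=1024, out_features=1000, bias=True)

with torch.no_grad():
    feats = m.features(torch.ones((1, 3, 224, 224)))
print(feats.shape)    # torch.Size([1, 1024, 7, 7])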

Thank you for the help, I am still very new to PyTorch.

Hi,

From the original DenseNet code, it looks like you're missing an average pooling layer between the features and the classifier.
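
Something along these lines should fix it. This is a minimal sketch of the forward pass, mirroring what torchvision's DenseNet does internally (a final ReLU, a global average pool, and a flatten before the Linear classifier); the exact calls may differ slightly between torchvision versions:

import torch.nn.functional as F

def forward(self, x):
    x = self.features(x)                   # (N, 1024, 7, 7) for 224x224 inputs
    x = F.relu(x, inplace=True)            # torchvision applies a final ReLU after the feature block
    x = F.adaptive_avg_pool2d(x, (1, 1))   # global average pool -> (N, 1024, 1, 1)
    x = x.view(x.size(0), -1)              # flatten -> (N, 1024), which the Linear classifier expects
    x = self.classifier(x)                 # (N, 1000)
    return x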
