Model Created With *list, .children(), and nn.Sequential Produces Different Output Tensors

I’m currently trying to use a pretrained DenseNet in my model. I’m following this tutorial: https://pytorch.org/hub/pytorch_vision_densenet/, and it works well: with an input of [1, 3, 224, 224], it returns a [1, 1000] tensor, exactly as expected.
However, I’m currently using this code to load the pretrained DenseNet into my model and use it as a feature extractor. This is the code in the __init__ function:

base_model = torch.hub.load('pytorch/vision:v0.10.0', 'densenet121', pretrained=True)

self.base_model = nn.Sequential(*list(base_model.children())[:-1])

And it is being used like this in the forward function:

x = self.base_model(x)

This, however, given the same input, returns a tensor of size [1, 1024, 7, 7]. I cannot figure out what is going wrong. I think it is due to the fact that DenseNet connects all the layers together, but I do not know how to get it to work the same way. Any tips on how to use a pretrained DenseNet in my own model?

Re-wrapping the modules of a model into an nn.Sequential container can break any model that uses the functional API in its forward method, or any model whose forward does not simply initialize and apply all registered layers in sequence.
In your case, the nn.Sequential wrapper drops exactly these functional API calls from DenseNet’s forward, which is why the output shape differs.
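
For reference, torchvision’s DenseNet.forward applies a functional ReLU, an adaptive average pooling, and a flatten between self.features and self.classifier, and none of these appear in .children(). A minimal sketch of a feature extractor that restores them (the class name here is my own, hypothetical choice):

import torch
import torch.nn as nn
import torch.nn.functional as F

class DenseNetFeatureExtractor(nn.Module):
    """Sketch: pretrained DenseNet backbone plus the functional calls
    from torchvision's DenseNet.forward that the nn.Sequential wrap drops."""

    def __init__(self):
        super().__init__()
        base_model = torch.hub.load(
            'pytorch/vision:v0.10.0', 'densenet121', pretrained=True)
        # Keep only the convolutional feature blocks, not the classifier.
        self.features = base_model.features

    def forward(self, x):
        x = self.features(x)                  # -> [N, 1024, 7, 7] for 224x224 input
        x = F.relu(x, inplace=True)           # functional call missing from the Sequential
        x = F.adaptive_avg_pool2d(x, (1, 1))  # -> [N, 1024, 1, 1]
        x = torch.flatten(x, 1)               # -> [N, 1024]
        return x

With a [1, 3, 224, 224] input this returns a [1, 1024] feature vector; keeping base_model.classifier and applying it after the flatten would reproduce the original [1, 1000] output.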

Thank you for your response! I guess next time I should take a look at the source code to see which layers are missing.
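
For anyone else reading: a quick way to check the forward without digging through the repository is to print its source, sketched here as a suggestion:

import inspect
import torch

model = torch.hub.load('pytorch/vision:v0.10.0', 'densenet121', pretrained=True)
# Print the model's forward to spot functional calls (F.relu,
# F.adaptive_avg_pool2d, torch.flatten) that .children() cannot capture.
print(inspect.getsource(model.forward))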