What I'm trying to do is remove the last layer from the Inception-with-batch-normalization model. This model was presented in "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift", where the authors modified the original GoogLeNet model by adding BN layers in several places in the network.
I removed the last layer by applying nn.Sequential(*list(pretrained_model.children())[:-1]), as many others do. But my problem is that I use a specific version of the pretrained model which is constructed so that it takes the outputs of all branches of each Inception submodule and concatenates them (as in the definition of the Inception module in the GoogLeNet paper). This causes the following problem: because I use nn.Sequential to keep all the layers except the last, the layers are chained one after the other, the customised forward() function of my model is discarded, and the plain forward() of nn.Sequential is called instead. That leads to wrong computations: GoogLeNet computes multiple conv layers in parallel and then concatenates them, while nn.Sequential runs all of those conv layers sequentially, so layers that should have been computed in parallel are computed one on top of the other, which of course causes input-shape mismatches in the next layers.
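To make the issue concrete, here is a minimal sketch (a hypothetical toy block, not the actual pretrained model) showing how wrapping a module's children in nn.Sequential throws away a custom forward() that runs branches in parallel:

```python
import torch
import torch.nn as nn

# Toy "inception-style" block: two conv branches computed in parallel
# on the SAME input, concatenated along the channel dimension.
class ToyInception(nn.Module):
    def __init__(self):
        super().__init__()
        self.branch1 = nn.Conv2d(3, 8, kernel_size=1)
        self.branch2 = nn.Conv2d(3, 8, kernel_size=3, padding=1)

    def forward(self, x):
        # the parallelism lives here, in forward(), not in the module list
        return torch.cat([self.branch1(x), self.branch2(x)], dim=1)

x = torch.randn(1, 3, 4, 4)

# Works: the custom forward() is called, output has 8 + 8 = 16 channels.
out = ToyInception()(x)
print(out.shape)  # torch.Size([1, 16, 4, 4])

# Breaks: nn.Sequential over children() ignores the custom forward()
# and pipes branch1's 8-channel output into branch2, which expects 3.
flattened = nn.Sequential(*list(ToyInception().children()))
try:
    flattened(x)
except RuntimeError as e:
    print("shape error:", e)
```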
Does anyone have an idea how to solve such a problem? Remember that the problem starts from the fact that I want to remove the last layer of the model. I currently do nn.Sequential(*list(pretrained_model.children())[:-1]), but do you maybe have in mind a way to remove the last layer without calling nn.Sequential, i.e. do it directly on my model?
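One idea I'm considering (a sketch, assuming the last layer is stored as a named attribute such as `fc`, as in torchvision's GoogLeNet/Inception models): replace that attribute with nn.Identity() instead of rebuilding the model with nn.Sequential. The model object is otherwise untouched, so its custom forward() that runs the branches in parallel keeps being used; the "last layer" just becomes a no-op. TinyNet below is a hypothetical stand-in for the pretrained model:

```python
import torch
import torch.nn as nn

class TinyNet(nn.Module):  # hypothetical stand-in for pretrained_model
    def __init__(self):
        super().__init__()
        self.features = nn.Linear(4, 6)
        self.fc = nn.Linear(6, 2)  # the "last layer" we want to drop

    def forward(self, x):
        # custom forward() is preserved, since we never rebuild the model
        return self.fc(torch.relu(self.features(x)))

pretrained_model = TinyNet()
pretrained_model.fc = nn.Identity()  # last layer effectively removed

out = pretrained_model(torch.randn(1, 4))
print(out.shape)  # torch.Size([1, 6]) -- features before the old fc
```

Would this also work for a model whose final layer is buried deeper, or is subclassing and overriding forward() the cleaner option there?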