I’m trying to reimplement a paper about bilinear pooling (Hierarchical Bilinear Pooling), so I need to get the intermediate outputs of some convolutional layers. When I tried this with DenseNet121, I found it difficult because the forward pass of DenseNet is quite different from VGG or ResNet. For example, I want to get the outputs of the dense layers in the last dense block.

Here is what I tried:
```python
class DenseNet121(nn.Module):
    def __init__(self, num_classes=102):
        super().__init__()
        self.model = models.densenet121(pretrained=True)
        self.layer3 = nn.Sequential(*list(self.model.children()))[-2][:-2]
        self.layer4_1 = nn.Sequential(*list(self.model.children()))[-2][-2].denselayer1
        self.layer4_2 = nn.Sequential(*list(self.model.children()))[-2][-2].denselayer2
        num_ftrs = self.model.classifier.in_features
        self.model.classifier = nn.Sequential(
            nn.Dropout(0.3),
            torch.nn.Linear(num_ftrs, num_classes)
        )
        # Initialize the fc layers.
        torch.nn.init.kaiming_normal_(self.model.classifier.weight.data)
        if self.model.classifier.bias is not None:
            torch.nn.init.constant_(self.model.classifier.bias.data, val=0)

    def forward(self, X):
        X = self.layer3(X)
        X = self.layer4_1(X)
        X = self.layer4_2(X)
        return X
```
It raises this error:
```
   2056     return torch.batch_norm(
   2057         input, weight, bias, running_mean, running_var,
-> 2058         training, momentum, eps, torch.backends.cudnn.enabled
   2059     )
   2060

RuntimeError: running_mean should contain 32 elements not 544
```
I know this error is probably caused by an input-dimension mismatch in `batch_norm` (the number of input channels doesn't match the layer's `num_features`). How can I fix this? Or did I make a mistake somewhere else?
I also tried modifying the forward method of the original network in the same way, with the same result.
Note: what I really need is the output of `model.features.denseblock4.denselayer16`. I'd appreciate it if anyone could tell me how to achieve this. Thanks in advance @ptrblck