Transfer learning with ResNet: very low accuracy

I’m trying to use ResNet (18 and 34) for transfer learning. Although my cross-entropy loss is decreasing (slowly), the accuracy remains extremely low. My model is the following:

import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision


class ResNet(nn.Module):

    def __init__(self):
        super().__init__()
        
        # Load the pre-trained ResNet and drop its final FC layer,
        # keeping everything up to (and including) the average pool
        resnet = torchvision.models.resnet18(pretrained=True)
        modules = list(resnet.children())[:-1]
        resnet = nn.Sequential(*modules)
        for param in resnet.parameters():
            param.requires_grad = False
        self.features = resnet
        
        # Add FC layer(s) for classification
        self.fc1 = nn.Linear(512, 1024)
        torch.nn.init.xavier_uniform_(self.fc1.weight)
        
        self.fc2 = nn.Linear(1024, 2048)
        torch.nn.init.xavier_uniform_(self.fc2.weight)
        
        self.fc3 = nn.Linear(2048, 2048)
        torch.nn.init.xavier_uniform_(self.fc3.weight)
        
        self.fc4 = nn.Linear(2048, NUM_CLASSES)  # NUM_CLASSES defined elsewhere
        torch.nn.init.xavier_uniform_(self.fc4.weight)
        
    def forward(self, x):
        out = F.relu(self.features(x))

        # features output is (N, 512, 1, 1), so flatten to (N, 512)
        out = F.relu(self.fc1(out.view(-1, 512)))
        out = F.relu(self.fc2(out))
        out = F.relu(self.fc3(out))
        out = self.fc4(out)
        
        return out

net = ResNet().to(device)

I’m not using dropout at the moment since I’m trying to overfit the training data. This is the first time I’ve tried to use a pre-trained model: can you spot anything wrong in my code?

What is the performance if you do not use a pretrained model?
If that’s all right, then the problem is probably input normalization.

Please check the following post for more details.


I’m also getting the same issue using DenseNet and ResNet152.