Yes, you can do that. You likely forgot to flatten some dimensions before passing the tensor to the Linear layer (try print(x.size()) before self.top(x) — it will be 4D). Just do self.top(x.view(x.size(0), -1)), or x.view(-1, 512) if you know the flattened feature size is 512.
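A minimal sketch of that flattening step, assuming a 4D feature map of shape (N, 512, 1, 1) coming out of the conv stack (512 matches ResNet-18/34; the names x and top stand in for your own attributes):

```python
import torch
import torch.nn as nn

# Hypothetical feature map as it leaves the conv layers: (batch, 512, 1, 1)
x = torch.randn(4, 512, 1, 1)
top = nn.Linear(512, 10)  # stand-in for self.top

# Flatten everything except the batch dimension before the Linear layer.
x = x.view(x.size(0), -1)  # shape becomes (4, 512)
out = top(x)
print(out.shape)  # torch.Size([4, 10])
```

Using x.size(0) for the batch dimension is slightly safer than view(-1, 512), since it fails loudly if the feature size ever changes.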
Additionally, in the case of ResNet, you could just replace the final layer with a new one and it should work, I think:
model = models.__dict__[opt.arch](pretrained=True)  # pretrained=False if you don't want to use pre-trained weights
model.fc = nn.Linear(512, num_classes)
So what is the right way to add more than one layer instead of just replacing the existing one? And I also want to keep the original name of each predefined layer.
In __init__, if I use
feat = nn.Sequential(*list(resnet.children())[:-1])
fc1 = nn.Linear(a, b)
fc2 = nn.Linear(b, c)