Overfitting and hanging

Sure! Assuming you want to add another linear layer and a dropout layer to resnet50's classifier (which is a single linear layer assigned to model.fc), you could replace model.fc with a new nn.Sequential container holding the new layers followed by the original linear layer, as seen here:

import torch
import torch.nn as nn
from torchvision import models

model = models.resnet50()

nb_features = model.fc.in_features  # 2048 for resnet50
model.fc = nn.Sequential(
    nn.Linear(nb_features, nb_features),
    nn.Dropout(),
    model.fc  # the original in_features -> 1000 linear layer
)

x = torch.randn(2, 3, 224, 224)
out = model(x)
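Note that nn.Dropout is only active in training mode; calling model.eval() disables it. A minimal sketch of that behavior, using a stand-in nn.Linear for the original model.fc (and the hard-coded 2048 feature size) so the full backbone isn't needed:

import torch
import torch.nn as nn

nb_features = 2048  # resnet50's in_features for model.fc

# Same structure as the replaced model.fc above
new_fc = nn.Sequential(
    nn.Linear(nb_features, nb_features),
    nn.Dropout(),
    nn.Linear(nb_features, 1000),  # stand-in for the original model.fc
)

feats = torch.randn(2, nb_features)

new_fc.train()
a = new_fc(feats)
b = new_fc(feats)
# a and b differ, since dropout samples a new mask each forward pass

new_fc.eval()
c = new_fc(feats)
d = new_fc(feats)
# c and d are identical, since dropout is a no-op in eval mode

Remember to call model.train() / model.eval() accordingly during fine-tuning and inference, or the dropout layer will either be silently disabled or add unwanted noise.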