Model not getting better after each epoch

I am using transfer learning for a multi-label image classification task (I changed the classifier layer of resnet50).

I have changed the fully connected layer of resnet50 as follows:

model = models.resnet50(pretrained=True)
n_inputs = model.fc.in_features
model.fc = nn.Sequential(
    nn.Linear(n_inputs, 256),
    nn.Linear(256, n_classes),
)
model = model.to(device)  # device assumed defined earlier, e.g. "cuda" or "cpu"

def fit(n_epochs):
    for epoch in range(n_epochs):
        running_loss = 0.0
        for i, (inputs, labels) in enumerate(Bar(train_loader)):
            inputs, labels = inputs.to(device), labels.to(device)

            optimizer.zero_grad()       # clear gradients from the previous step
            outputs = model(inputs)
            loss = criterion(outputs, labels)
            loss.backward()             # backpropagate
            optimizer.step()            # update the weights

            running_loss += loss.item()
        print("Finished epoch {}; running_loss = {}".format(epoch, running_loss))

    print('Finished Training')
4794/4794: [===============================>] - ETA 0.1s
Finished epoch 0; running_loss = 1571.782483279705
Finished Training
4794/4794: [===============================>] - ETA 0.1s
Finished epoch 0; running_loss = 1571.5039553046227
Finished Training

My loss function and optimizer are:

criterion = nn.NLLLoss()
optimizer = optim.Adam(model.parameters(), lr=0.00001)

After each epoch, my model is not getting any better. Is there any mistake in my training loop?

The training loop looks alright.

Are you working on a multi-label (each sample belongs to 0, 1 or more classes) or multi-class (each sample belongs to a single class only) classification?
In the former case, nn.NLLLoss is most likely the wrong criterion to use.
Also, what did you change in the resnet? Since you are using nn.NLLLoss, you would have to add F.log_softmax as the last activation function.
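To illustrate the point above: nn.CrossEntropyLoss applies log_softmax internally, so it is equivalent to F.log_softmax followed by nn.NLLLoss. A minimal sketch with made-up tensors (the shapes here are illustrative, not from your model):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
logits = torch.randn(8, 4)           # raw model outputs: 8 samples, 4 classes
targets = torch.randint(0, 4, (8,))  # one class index per sample

# nn.NLLLoss expects log-probabilities, so log_softmax must be applied first
nll = nn.NLLLoss()(F.log_softmax(logits, dim=1), targets)

# nn.CrossEntropyLoss takes raw logits and applies log_softmax internally
ce = nn.CrossEntropyLoss()(logits, targets)

print(torch.allclose(nll, ce))  # True: the two formulations match
```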

I am working on a multi-label classification problem; there are 4 classes in total.
I changed nn.NLLLoss to nn.CrossEntropyLoss and now my model is training :grinning:
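Worth noting: if the task is genuinely multi-label (a sample can carry several of the 4 labels at once), nn.CrossEntropyLoss assumes exactly one class per sample; the usual choice there is nn.BCEWithLogitsLoss with multi-hot targets. A sketch with made-up tensors, assuming 4 classes:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
logits = torch.randn(8, 4)                     # raw outputs, one logit per class
targets = torch.randint(0, 2, (8, 4)).float()  # multi-hot labels: several classes may be active

# BCEWithLogitsLoss treats each class as an independent binary decision
# and applies sigmoid internally, so the model outputs raw logits.
criterion = nn.BCEWithLogitsLoss()
loss = criterion(logits, targets)
print(loss.item())
```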

Should I use the Adam or the SGD optimizer?
Thanks a lot !!

Good to hear it’s working now!

I don’t have a general answer, so you would have to try both for your use case. :wink:
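One way to act on that advice: train the same model twice under identical conditions, once per optimizer, and compare. A minimal sketch on a stand-in linear model (the hyperparameters are illustrative, not recommendations):

```python
import torch
import torch.nn as nn
import torch.optim as optim

def train_briefly(opt_name):
    """Train a small stand-in model for 50 steps and return (start, end) loss."""
    torch.manual_seed(0)
    model = nn.Linear(10, 4)           # stand-in for the resnet50 head
    x = torch.randn(32, 10)
    y = torch.randint(0, 4, (32,))
    criterion = nn.CrossEntropyLoss()
    if opt_name == "adam":
        optimizer = optim.Adam(model.parameters(), lr=1e-2)
    else:
        optimizer = optim.SGD(model.parameters(), lr=1e-1, momentum=0.9)
    start = criterion(model(x), y).item()
    for _ in range(50):
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()
    return start, loss.item()

for name in ("adam", "sgd"):
    s, e = train_briefly(name)
    print(f"{name}: {s:.3f} -> {e:.3f}")  # both typically decrease on this toy problem
```

Which one ends up lower on held-out data depends on the task, so a comparison like this on your own validation set is more informative than any general rule.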

Sure, thanks a lot :slightly_smiling_face: