Same Loss value for 15 to 20 iterations

I am working on a Dog Breed Classification problem and I am using a pretrained model for training.

# Creating the model: freeze the pretrained backbone and replace the classifier head
import torch.nn as nn
from torchvision import models

model = models.resnet50(pretrained=True)
for param in model.parameters():
    param.requires_grad = False  # freeze all backbone weights
num_features = model.fc.in_features
fc_layers = nn.Sequential(
    nn.Linear(num_features, 4096),
    nn.ReLU(inplace=True),
    nn.Dropout(p=0.1),
    nn.Linear(4096, num_classes),  # num_classes = number of dog breed labels
    nn.ReLU(inplace=True),
    nn.Dropout(p=0.1),
)
model.fc = fc_layers

This is how I have modified the pretrained model. I am using Adam as the optimizer with a batch size of 32.

import torch
import torch.optim as optim

optimizer = optim.Adam(model.parameters(), lr=0.001)
def criterion(yhat, y):
    # manual cross entropy: softmax probability of the target class, then negative log-likelihood
    label = yhat.gather(1, y.view(-1, 1)).squeeze()
    softmax_output = torch.exp(label) / torch.sum(torch.exp(yhat), dim=1)
    loss = -torch.log(softmax_output)
    return loss.sum()
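For reference, this is meant to match nn.CrossEntropyLoss with reduction='sum'; here is a quick sanity check on random tensors (the 32×120 shape is just an example):

import torch
import torch.nn as nn

yhat = torch.randn(32, 120)       # example logits: batch of 32, 120 classes
y = torch.randint(0, 120, (32,))  # example integer class labels
builtin = nn.CrossEntropyLoss(reduction='sum')
print(criterion(yhat, y).item(), builtin(yhat, y).item())  # values should agree closely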

Since the dataset is quite large, I am printing the loss after every iteration to keep track of it.

Within the first epoch itself, I am getting the same loss value after 15 to 20 iterations. The loss value is 153.109, so I don't think this is a convergence point. I am not sure how to get out of this situation.

I would recommend removing the last ReLU and dropout layers and returning the logits directly. The trailing nn.ReLU clamps every negative logit to zero, which restricts what the softmax can express and can produce exactly this kind of loss plateau.
Also, what's the reason you are implementing the loss manually instead of using nn.CrossEntropyLoss?
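Something like this, as a sketch (reusing num_features and num_classes from your snippet):

import torch.nn as nn
import torch.optim as optim

model.fc = nn.Sequential(
    nn.Linear(num_features, 4096),
    nn.ReLU(inplace=True),
    nn.Dropout(p=0.1),
    nn.Linear(4096, num_classes),  # return raw logits; no ReLU/Dropout afterwards
)
criterion = nn.CrossEntropyLoss()  # applies log_softmax internally, numerically stable
optimizer = optim.Adam(model.fc.parameters(), lr=0.001)  # optionally, pass only the trainable head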

I got my answer. Thanks for your help, @ptrblck!