ValueError: Expected target size (4, 170), got torch.Size([4])

I am running into a dimension-mismatch issue after the F.log_softmax layer when computing the loss. I am using a batch size of 4.

# inside forward()
# fc_classify is the output of a fully connected layer, shape torch.Size([1108, 170]);
# 170 is the number of labels and self.batch_size = 4
fc_classify = fc_classify.view(self.batch_size, -1, fc_classify.shape[1])  # -> torch.Size([4, 277, 170])

scores = F.log_softmax(fc_classify, dim=1)  # torch.Size([4, 277, 170]); 277 is the max sentence length

loss_classification = nn.NLLLoss()

# label_pred is the output of the log_softmax above, shape torch.Size([4, 277, 170]);
# target_label has shape torch.Size([4])
loss = loss_classification(label_pred, target_label)

So the issue is the additional sequence dimension of 277. I don't know how to resolve it; the code worked before I introduced batching.
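For reference, the mismatch can be reproduced with random tensors of the same shapes (a minimal sketch, not my actual model): `nn.NLLLoss` interprets a 3-D input as `(N, C, d1)`, so with an input of `(4, 277, 170)` it reads 277 as the number of classes and expects a target of shape `(4, 170)`, which matches the error message above.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical stand-ins for the real tensors: random logits with the
# same shapes as in the question.
scores = F.log_softmax(torch.randn(4, 277, 170), dim=1)  # (batch, seq_len, labels)
target_label = torch.randint(0, 170, (4,))               # one label per batch element

# nn.NLLLoss treats this 3-D input as (N, C, d1): it takes 277 as the
# class dimension and therefore expects a target of shape (4, 170).
try:
    loss = nn.NLLLoss()(scores, target_label)
except (ValueError, RuntimeError) as e:
    print(type(e).__name__, e)
```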