PyTorch LSTM: Target Dimension in Calculating Cross Entropy Loss


I’ve been trying to get an LSTM (an LSTM followed by a linear layer, in a custom model) working in PyTorch, but I was getting the following error when calculating the loss:

Assertion `cur_target >= 0 && cur_target < n_classes' failed.

I defined the loss function with:

criterion = nn.CrossEntropyLoss()

and then called it with

loss += criterion(output, target)

I was passing a target with dimensions [sequence_length, number_of_classes], while the output has dimensions [sequence_length, 1, number_of_classes].

The examples I was following seemed to do the same thing, but the PyTorch docs for cross entropy loss describe it differently.

The docs say the target should be of dimension (N), where each value satisfies 0 ≤ targets[i] ≤ C−1 and C is the number of classes. I changed the target to that form, but now I’m getting the following error (the sequence length is 75, and there are 55 classes):

Expected target size (75, 55), got torch.Size([75])

I’ve tried looking up solutions for both errors, but I still can’t get this working properly. I’m confused about the proper dimensions of the target, as well as the actual meaning of the first error (different searches gave very different explanations for it, and none of the fixes worked).
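For reference, here is a minimal sketch that reproduces the second error with random data — the shapes and class count match my setup above, and the actual model is omitted:

```python
import torch
import torch.nn as nn

seq_length, num_classes = 75, 55
criterion = nn.CrossEntropyLoss()

# output of the linear layer: [seq_length, batch=1, num_classes]
output = torch.randn(seq_length, 1, num_classes)
# target as class indices, per the docs: one value in [0, C-1] per timestep
target = torch.randint(0, num_classes, (seq_length,))

try:
    loss = criterion(output, target)
except RuntimeError as e:
    # CrossEntropyLoss reads the class dimension from dim 1, which is 1 here,
    # so it expects a per-element target of shape (75, 55) and fails
    print(e)
```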



Try permuting your output and target so that the batch dimension is in dim0, i.e. your output should be [1, number_of_classes, seq_length], while your target should be [1, seq_length].
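In code, assuming an output of shape [seq_length, 1, num_classes] and a target of class indices of shape [seq_length] as in the question, that would look something like:

```python
import torch
import torch.nn as nn

seq_length, num_classes = 75, 55
criterion = nn.CrossEntropyLoss()

output = torch.randn(seq_length, 1, num_classes)      # [seq, batch, classes]
target = torch.randint(0, num_classes, (seq_length,)) # class indices in [0, C-1]

# CrossEntropyLoss expects input of shape (N, C, ...) and target of shape (N, ...),
# so move the batch to dim0 and the classes to dim1
loss = criterion(output.permute(1, 2, 0),  # [1, num_classes, seq_length]
                 target.unsqueeze(0))      # [1, seq_length]
```

Equivalently, since the batch size is 1, you could drop the batch dimension instead: `criterion(output.squeeze(1), target)` gives an input of [seq_length, num_classes] and a target of [seq_length], which also matches the (N, C) / (N) convention.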