How to compute Cross Entropy Loss for sequences

I have a sequence continuation/prediction task (input: a sequence of class indices, output: a sequence of class indices) and I use Pytorch.

My neural network returns a tensor of shape (batch_size, sequence_length, num_classes) where each entry is a score proportional to the probability that the class with that index is the next class in the sequence. My targets in the training data have shape (batch_size, sequence_length) (just the sequences of ground-truth class indices).

I want to use CrossEntropyLoss.

My question: How do I use the Cross Entropy Loss function? Which input shapes are required?

Thank you!

I did this once with an LSTM model. The approach is the same as for any cross-entropy loss: reshape the predictions to (batch_size * sequence_length, num_classes) and the targets to (batch_size * sequence_length,), i.e. one ground-truth class index per step, and pass both to the loss.
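
A minimal sketch of what that looks like, assuming the model outputs raw logits (no softmax) and using placeholder sizes; it also shows the alternative of passing the class dimension second, which nn.CrossEntropyLoss accepts as well:

```python
import torch
import torch.nn as nn

# placeholder sizes for illustration
batch_size, sequence_length, num_classes = 4, 10, 7

# model output: raw, unnormalized scores (logits), shape (batch, seq_len, num_classes)
logits = torch.randn(batch_size, sequence_length, num_classes)
# ground-truth class indices, shape (batch, seq_len)
targets = torch.randint(0, num_classes, (batch_size, sequence_length))

criterion = nn.CrossEntropyLoss()

# option 1: flatten batch and time into one dimension
# predictions: (batch * seq_len, num_classes), targets: (batch * seq_len,)
loss = criterion(logits.reshape(-1, num_classes), targets.reshape(-1))

# option 2: keep the batch dimension and move the class dimension to position 1,
# since CrossEntropyLoss also accepts (N, C, d1, ...) inputs with (N, d1, ...) targets
loss_alt = criterion(logits.permute(0, 2, 1), targets)

print(loss.item(), loss_alt.item())  # both give the same value with the default mean reduction
```

Either way, pass the logits directly; CrossEntropyLoss applies log-softmax internally, so don't put a softmax layer before it.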