Dimensionality of batches

When creating batches like this:

from torchtext.data import Iterator

train_iter = Iterator(trn, batch_size=4)

I find that the batch for the target has shape torch.Size([4, NumOfClasses]), while the batch for the predictors has shape torch.Size([NumOfWordsInSentence, 4]).

This is a problem for my code because CrossEntropyLoss() requires the first dimension to be the batch size.

Where am I going wrong? I expected the first dimension to be 4 for both the target and the predictors.

E.g. in my code:

for i, batch in enumerate(train_iter):
    if i == 5:
        break
    print(batch.text.size(), batch.category.size())

# Output
Generated string:  ['moon', 'earth', 'world', 'world', 'hello']
Length =  5
torch.Size([6, 4]) torch.Size([4, 3])
torch.Size([9, 4]) torch.Size([4, 3])
torch.Size([7, 4]) torch.Size([4, 3])
torch.Size([5, 4]) torch.Size([4, 3])
torch.Size([9, 4]) torch.Size([4, 3])
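
For reference, here is a minimal, self-contained sketch (plain PyTorch, independent of my gist) of the shapes that CrossEntropyLoss() does accept, which is why the layouts above trip me up:

import torch
import torch.nn as nn

loss_fn = nn.CrossEntropyLoss()

batch_size, num_classes = 4, 3
logits = torch.randn(batch_size, num_classes)  # predictors: [batch, classes]
target = torch.tensor([0, 2, 1, 0])            # target: [batch] of class indices

print(loss_fn(logits, target))  # works: the first dimension is the batch size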

Full code: https://gist.github.com/av-maslov/1802eacdf8ceb4704de4deee77fae8e0
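
If it helps, the workaround I am considering (a sketch, assuming the legacy torchtext Field API; TEXT is a hypothetical field name, not from my gist) is either to transpose the text batch inside the loop or to ask the Field for batch-first tensors:

from torchtext.data import Field

# Option 1: transpose inside the loop above, turning batch.text
# from [NumOfWordsInSentence, 4] into [4, NumOfWordsInSentence]:
#     text = batch.text.t()

# Option 2: build batch-first tensors from the start, so the
# Iterator yields text batches of shape [4, NumOfWordsInSentence]:
TEXT = Field(sequential=True, batch_first=True)

But I am not sure whether one of these is the intended fix or whether I am misusing the Iterator itself.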