Confused about target_label when using nn.CrossEntropyLoss

Hello! I am a little confused about how to compute the cross entropy loss. Here is my situation:
For example, the batch size is 2 and there are 3 classes to predict.

#               c1   c2   c3
pred_label = [[0.2, 0.5, 0.3],
              [0.3, 0.6, 0.1]]  # shape: [2, 3]

The target label for the 1st sample in the batch is c1, and for the 2nd sample is c3.
The question is: what is the correct form of the target label to use? In TensorFlow, the target has to be one-hot encoded, like this:

target_label = [[1, 0, 0],
                [0, 0, 1]]
tf.nn.softmax_cross_entropy_with_logits_v2(labels=target_label, logits=pred_label)

I tried one-hot encoding in PyTorch, but got an error.
It seems that PyTorch does not need one-hot encoding. Maybe I can use the original class indices directly?

target_label = torch.tensor([0, 2])  # class indices: c1 and c3
torch.nn.CrossEntropyLoss()(pred_label, target_label)

I am not sure if I understand this right. :slight_smile:

Yes, you can pass the class indices directly in PyTorch; nn.CrossEntropyLoss does not expect one-hot targets. One thing to watch out for: it expects raw logits as input (it applies log_softmax internally), so don't apply softmax to pred_label beforehand.
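A minimal runnable sketch with your numbers (treating pred_label as raw logits here, not probabilities):

import torch
import torch.nn as nn

# Raw logits (no softmax applied), shape [batch_size=2, num_classes=3]
pred_label = torch.tensor([[0.2, 0.5, 0.3],
                           [0.3, 0.6, 0.1]])

# Class indices: sample 1 -> c1 (index 0), sample 2 -> c3 (index 2)
target_label = torch.tensor([0, 2])

criterion = nn.CrossEntropyLoss()
loss = criterion(pred_label, target_label)
print(loss)  # scalar tensor; the loss is averaged over the batch by default

Internally this combines LogSoftmax and NLLLoss in one call. (For completeness: recent PyTorch releases, 1.10 and later, also accept floating-point class-probability targets, so a one-hot-style target can work there too, but class indices are the usual approach.)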

Yes, it’s very convenient! Thanks.