How to apply CrossEntropyLoss with top-k predicted labels?

out = classifier(model(input))                        # [batch, num_classes] logits
_, topk_label = torch.topk(out, k=2, dim=-1)          # [batch, 2] class indices
top1_label, top2_label = topk_label[:, 0], topk_label[:, 1]

ce = nn.CrossEntropyLoss()
loss1 = ce(top1_label, gt_labels)  # fails: first argument must be logits, not class indices
loss2 = ce(top2_label, gt_labels)
...

As far as I know, nn.CrossEntropyLoss does not accept hard (one-hot) predictions as its first parameter; it expects raw logits of shape [batch, num_classes].
How can I apply cross-entropy loss with these hard predictions (top1_label or top2_label)?
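To make the setup concrete, here is a minimal runnable version of what I am trying, with toy random tensors standing in for `classifier(model(input))` (the batch size, class count, and data are made-up placeholders, not my real model):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
batch, num_classes = 4, 10                 # made-up toy sizes
out = torch.randn(batch, num_classes)      # stand-in for classifier(model(input))
gt_labels = torch.randint(0, num_classes, (batch,))

# Top-2 predicted class indices per sample, shape [batch, 2].
_, topk_label = torch.topk(out, k=2, dim=-1)
top1_label, top2_label = topk_label[:, 0], topk_label[:, 1]

ce = nn.CrossEntropyLoss()

# This is the call that fails: top1_label holds integer class indices,
# while CrossEntropyLoss wants raw logits of shape [batch, num_classes].
try:
    ce(top1_label, gt_labels)
except Exception as e:
    print("fails as expected:", type(e).__name__)

# The standard call that does work, for comparison:
loss = ce(out, gt_labels)
print(loss.item())
```

The standard call works because `out` keeps the full logit vector per sample; my question is specifically what to do once the prediction has been reduced to a top-k index.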