Replacing cross_entropy loss with smooth_l1_loss

I am trying to replace the torch.nn.functional cross_entropy loss with smooth_l1_loss.
This is the line of code I have used for this:

loss = F.smooth_l1_loss(logits, label)

However, I am getting the following error:

UserWarning: Using a target size (torch.Size([729])) that is different to the input size (torch.Size([729, 27])). This will likely lead to incorrect results due to broadcasting. Please ensure they have the same size.

RuntimeError: The size of tensor a (27) must match the size of tensor b (729) at non-singleton dimension 1

I know that the two loss functions expect their targets in different formats. But how do I solve this problem? Is there any workaround for replacing cross_entropy with smooth_l1_loss or Huber loss?
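For reference, a minimal sketch reproducing the mismatch, using the shapes from the error message (the tensor values here are random placeholders for illustration):

import torch
import torch.nn.functional as F

logits = torch.randn(729, 27)            # (N, C) raw scores, as in the error message
label = torch.randint(0, 27, (729,))     # (N,) integer class indices

# cross_entropy accepts (N, C) input with (N,) class-index targets
ce = F.cross_entropy(logits, label)

# smooth_l1_loss requires input and target of the same shape,
# so (729, 27) vs (729,) triggers the warning and RuntimeError above
# F.smooth_l1_loss(logits, label)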

I don’t understand why you want to make this kind of replacement, since these are two functions commonly used for different kinds of problems: classification vs. regression.

But if you want, you can simply give your tensors compatible shapes, by using a single neuron in the last layer, so that the logits match the label in terms of dimensions.
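Here is a minimal sketch of that idea, assuming a hidden size of 128 for the penultimate layer (the label must also be cast to float, since smooth_l1_loss is a regression loss):

import torch
import torch.nn as nn
import torch.nn.functional as F

head = nn.Linear(128, 1)                       # single output neuron; 128 is an assumed hidden size

hidden = torch.randn(729, 128)                 # stand-in for the penultimate activations
pred = head(hidden).squeeze(1)                 # (729, 1) -> (729,)
label = torch.randint(0, 27, (729,)).float()   # regression targets must be float

loss = F.smooth_l1_loss(pred, label)           # shapes now match: (729,) vs (729,)

Keep in mind that this turns the task into a regression over the class indices, which imposes an ordering on the classes, so the results will generally differ from training with cross_entropy.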