I am trying to replace `torch.nn.functional.cross_entropy` with `smooth_l1_loss`.
This is the line of code I used:
loss = F.smooth_l1_loss(logits, label)
However, I am getting an error like this:

UserWarning: Using a target size (torch.Size([729])) that is different to the input size (torch.Size([729, 27])). This will likely lead to incorrect results due to broadcasting. Please ensure they have the same size.
RuntimeError: The size of tensor a (27) must match the size of tensor b (729) at non-singleton dimension 1
I know that the two loss functions expect their targets in different formats, but how do I solve this? Is there a workaround for replacing cross_entropy with smooth_l1_loss or Huber loss?
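For context, here is a sketch of one possible workaround (not from the original post). `cross_entropy` takes logits of shape `[N, C]` and integer class indices of shape `[N]`, while `smooth_l1_loss` is a regression loss that requires both tensors to have the same shape. One option is to one-hot encode the labels so the target matches the `[N, C]` input; the shapes below (729 samples, 27 classes) are taken from the error message, and comparing softmax probabilities to the one-hot target is an assumption about the intended setup:

```python
import torch
import torch.nn.functional as F

# Shapes taken from the error message: 729 samples, 27 classes
logits = torch.randn(729, 27)
label = torch.randint(0, 27, (729,))

# cross_entropy accepts integer class indices directly
ce_loss = F.cross_entropy(logits, label)

# smooth_l1_loss needs input and target of the same shape,
# so expand the labels into a one-hot [729, 27] target.
target = F.one_hot(label, num_classes=27).float()

# Compare probabilities (softmax of logits) against the one-hot target
sl1_loss = F.smooth_l1_loss(torch.softmax(logits, dim=1), target)
```

Whether this is a sensible training objective depends on the task; Smooth L1 treats each class probability as an independent regression target rather than modelling a categorical distribution.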