Network returns torch.int64 and target variable is torch.float64

RuntimeError: "log_softmax_lastdim_kernel_impl" not implemented for 'Long'

output = model(batch)
_, pred = torch.max(output, 1)   
loss(pred, targets)
print(f"pred output = {pred.dtype}, target={targets.dtype}")

I am getting the following error: RuntimeError: "log_softmax_lastdim_kernel_impl" not implemented for 'Long'

When I checked the data types of "pred" and "targets", they are torch.int64 and torch.float64 respectively.

I tried using target.long(), but then it says: RuntimeError: Expected floating point type for target with class probabilities, got Long

When I use pred = pred.to(torch.float32), the loss computation works, but then loss.backward() fails with:

RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn

How to fix this ?

pred represents the indices corresponding to the maximum values in output. This argmax operation is not differentiable, so pass the logits directly to your loss function if it's e.g. nn.CrossEntropyLoss.
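A minimal sketch of this fix, assuming a plain classification setup (the model, shapes, and class count below are illustrative, not from the original post):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Linear(20, 10)             # stand-in for the real network
batch = torch.randn(4, 20)            # 4 samples, 20 features (assumed shapes)
targets = torch.randint(0, 10, (4,))  # class indices, dtype torch.int64 (long)

criterion = nn.CrossEntropyLoss()

output = model(batch)                 # raw logits: float, requires grad
loss = criterion(output, targets)     # pass the logits, NOT the argmax indices
loss.backward()                       # works: the autograd graph is intact

# argmax is only used for reporting predictions/accuracy, outside the loss
pred = output.argmax(dim=1)           # pred.dtype is torch.int64, which is fine here
```

Calling torch.max/argmax before the loss is what broke the computation graph and produced the "does not require grad" error.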

The targets should then contain either class indices as long or, in newer PyTorch versions, alternatively probabilities as float to represent soft targets.
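To illustrate the two accepted target formats, here is a small sketch (soft float targets require a reasonably recent PyTorch version, roughly 1.10 or later; the shapes are made up for the example):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
logits = torch.randn(4, 10, requires_grad=True)
criterion = nn.CrossEntropyLoss()

# Option 1: hard targets as class indices, dtype long
hard_targets = torch.randint(0, 10, (4,))
loss_hard = criterion(logits, hard_targets)

# Option 2: soft targets as per-class probabilities, dtype float,
# same shape as the logits and each row summing to 1
soft_targets = torch.softmax(torch.randn(4, 10), dim=1)
loss_soft = criterion(logits, soft_targets)

loss_hard.backward()
loss_soft.backward()  # each loss has its own graph from the leaf tensor
```

Passing long targets where probabilities are expected (or vice versa) triggers exactly the "Expected floating point type for target with class probabilities, got Long" error from the question.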

Thank you @ptrblck . It solved my issue