Log_softmax_forward is not implemented for type torch.cuda.LongTensor?

When I use CrossEntropyLoss and convert the target to torch.cuda.LongTensor, it shows 'log_softmax_forward is not implemented for type torch.cuda.LongTensor'. So, what should I do?

How about trying

loss = loss_function(output, target.long())

Note: CrossEntropyLoss(input, target) takes LongTensor targets and FloatTensor inputs,

so make sure your outputs are float tensors.
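Here is a minimal sketch of that rule (the tensor values and sizes are made up for illustration): cast the target to long and leave the model output as float.

import torch
import torch.nn as nn

loss_function = nn.CrossEntropyLoss()

output = torch.randn(4, 10)                   # FloatTensor: [batch_size, nb_classes]
target = torch.tensor([1.0, 0.0, 9.0, 3.0])   # e.g. targets that were loaded as floats

loss = loss_function(output, target.long())   # cast the target, not the output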


Thanks, but when I converted the output to float, it shows 'RuntimeError: grad can be implicitly created only for scalar outputs'.

Are you using multiple GPUs with torch.nn.DataParallel?
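In case it helps: that error is raised when backward() is called on a loss that is not a scalar, which can happen with reduction='none' or when nn.DataParallel returns one loss value per GPU. A rough sketch of the usual fix, reducing the loss to a scalar first (sizes are made up):

import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss(reduction='none')  # per-sample losses, for illustration

output = torch.randn(4, 10, requires_grad=True)
target = torch.randint(0, 10, (4,))

loss = criterion(output, target)  # shape [4], not a scalar
loss.mean().backward()            # reduce to a scalar before calling backward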

A similar problem occurred for me, saying
RuntimeError: log_softmax_forward is not implemented for type torch.LongTensor

when using nn.CrossEntropyLoss() (but it works with MSELoss).

Before, I was getting

RuntimeError: Expected object of type torch.LongTensor but found type torch.FloatTensor for argument #2 'target'

I changed the tensor type to LongTensor by using .long() for both the input and label images in the criterion, as
loss = criterion(outputs.long(), images.long())

Now I'm getting this error. I've changed both to other types, and each combination gives a different error.

Please help!

For vanilla multi-class classification, nn.CrossEntropyLoss expects the model outputs to be a FloatTensor in the shape [batch_size, nb_classes], while the target should be a LongTensor in the shape [batch_size] containing class indices in the range [0, nb_classes-1].
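A small sketch of those shapes and dtypes, with made-up sizes (batch_size=8, nb_classes=5):

import torch
import torch.nn as nn

batch_size, nb_classes = 8, 5
criterion = nn.CrossEntropyLoss()

output = torch.randn(batch_size, nb_classes)          # FloatTensor: [8, 5]
target = torch.randint(0, nb_classes, (batch_size,))  # LongTensor: [8], values in [0, 4]

loss = criterion(output, target)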


I don't fully understand what you are saying, but I found my solution. What I was doing was using a sigmoid activation at the output; with that, nn.CrossEntropyLoss didn't work but BCELoss did. I also found that to use nn.CrossEntropyLoss, we have to explicitly use a softmax activation. Am I right?

No, you should pass the raw logits to nn.CrossEntropyLoss, as internally F.log_softmax and nn.NLLLoss will be used.
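A quick sketch demonstrating that equivalence (random values, just for illustration):

import torch
import torch.nn.functional as F

logits = torch.randn(4, 10)          # raw model outputs, no softmax applied
target = torch.randint(0, 10, (4,))

a = F.cross_entropy(logits, target)
b = F.nll_loss(F.log_softmax(logits, dim=1), target)
print(torch.allclose(a, b))  # True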


Thank you 🙂. I'm getting past it now.