The use of torch::sigmoid_

I am doing a classification job on MNIST. The last layer of my model is torch::log_softmax(x, /*dim=*/1). Now I want to convert the output tensor to a probability vector, so I use auto outsig = torch::sigmoid_(output);, but I get a bad outsig tensor:

1.6201e-08 5.1166e-06 2.5344e-06 2.8637e-03 2.4327e-02 1.9071e-04 5.6446e-08 1.2600e-04 1.9227e-03 4.9237e-01

and my output tensor is:

-17.9382 -12.1830 -12.8856 -5.8528 -3.6915 -8.5646 -16.6900 -8.9791

This is clearly inappropriate; what I expect is a probability vector whose entries sum to 1.

Can someone tell me what to do? Thanks!

If you have used torch::log_softmax in the final layer, I would assume that it's a multi-class classification problem. The suitable function for getting probabilities is torch::softmax() rather than torch::sigmoid().
Can you try torch::softmax()?
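Here is a minimal sketch of what I mean (the 1x10 shape and variable names are assumptions, not your actual code):

```cpp
#include <torch/torch.h>
#include <iostream>

int main() {
    // Stand-in for the model's output: a 1x10 row of log-probabilities,
    // as produced by a final torch::log_softmax layer.
    auto output = torch::log_softmax(torch::randn({1, 10}), /*dim=*/1);

    // softmax along the class dimension yields probabilities that sum to 1.
    auto probs = torch::softmax(output, /*dim=*/1);
    std::cout << probs << '\n';
    std::cout << "sum = " << probs.sum().item<float>() << '\n';  // ~1.0
    return 0;
}
```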

Thank you very much! You are right; choosing sigmoid was a mistake. My last layer uses log_softmax, so I should use exp(output) to get the probability vector whose sum is 1.
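For example, something like this sketch (variable names are just placeholders); since the last layer already applies the log, exp() simply undoes it:

```cpp
#include <torch/torch.h>
#include <iostream>

int main() {
    // output holds log-probabilities from the final log_softmax layer.
    auto output = torch::log_softmax(torch::randn({1, 10}), /*dim=*/1);

    // exp() undoes the log, recovering the softmax probabilities.
    auto probs = torch::exp(output);
    std::cout << "sum = " << probs.sum().item<float>() << '\n';  // ~1.0
    return 0;
}
```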

Actually, the results of sigmoid(output) and exp(output) are almost the same. I think this is because each x is a large negative number, so e^x is close to 0 and sigmoid(x) = e^x / (1 + e^x) ≈ e^x.
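A quick illustrative check of this, using a few values from my output tensor above:

```cpp
#include <torch/torch.h>
#include <iostream>

int main() {
    // For strongly negative x, sigmoid(x) = e^x / (1 + e^x) is roughly e^x,
    // because e^x is tiny and the denominator is close to 1.
    auto x = torch::tensor({-17.9382f, -12.1830f, -5.8528f, -3.6915f});
    std::cout << "sigmoid: " << torch::sigmoid(x) << '\n';
    std::cout << "exp:     " << torch::exp(x)     << '\n';
    return 0;
}
```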

I think that you should be using torch::softmax() instead of torch::exp().
Softmax normalizes each exp(output_i), i.e., exp(output_i) / (exp(output_0) + ... + exp(output_n)).
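For example (tensor shape assumed), the normalization written out by hand matches the built-in:

```cpp
#include <torch/torch.h>
#include <iostream>

int main() {
    auto output = torch::randn({1, 10});  // raw, unnormalized scores

    // exp(output_i) / (exp(output_0) + ... + exp(output_n))
    auto manual  = torch::exp(output) / torch::exp(output).sum(/*dim=*/1, /*keepdim=*/true);
    auto builtin = torch::softmax(output, /*dim=*/1);

    std::cout << std::boolalpha
              << torch::allclose(manual, builtin) << '\n';  // true
    return 0;
}
```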