I am doing a classification job on MNIST. The last layer of my model is torch::log_softmax(x, /*dim=*/1). Now I want to convert the output tensor to a probability vector, so I used auto outsig = torch::sigmoid_(output);, but I got a bad outsig tensor:
Since you used torch::log_softmax in the final layer, I assume it's a multi-class classification problem. The suitable function for getting probabilities is torch::softmax(), not torch::sigmoid().
Can you try it (torch::softmax())?
Thank you very much! You are right, choosing sigmoid was a mistake. My last layer uses log_softmax, so I should use exp(output) to get a probability vector whose sum is 1.
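For example, something like this should work (a minimal sketch; the logits tensor here is just random dummy data standing in for a real MNIST batch):

```cpp
#include <torch/torch.h>
#include <iostream>

int main() {
    // Dummy logits standing in for a real MNIST batch: 2 samples, 10 classes.
    auto logits = torch::randn({2, 10});
    auto output = torch::log_softmax(logits, /*dim=*/1);

    // output holds log-probabilities, so exp() recovers the probabilities.
    auto probs = torch::exp(output);

    // Each row sums to 1 (up to floating-point error).
    std::cout << probs.sum(/*dim=*/1) << std::endl;
}
```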
Actually, the results of sigmoid(output) and exp(output) are nearly the same here. I think this is because the log-probabilities are strongly negative, so e^x approaches 0 and sigmoid(x) = e^x / (1 + e^x) ≈ e^x.
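To illustrate (a small sketch with hand-picked negative values): for x far below zero, the denominator of sigmoid(x) is close to 1, so the two outputs agree to several digits:

```cpp
#include <torch/torch.h>
#include <iostream>

int main() {
    // Hand-picked negative values, like typical log-probabilities.
    auto x = torch::tensor({-2.0, -5.0, -10.0});

    // sigmoid(x) = e^x / (1 + e^x); for x << 0 the denominator is ~1,
    // so sigmoid(x) and exp(x) nearly coincide.
    std::cout << torch::sigmoid(x) << std::endl; // ~0.1192, ~0.0067, ~4.54e-05
    std::cout << torch::exp(x)     << std::endl; // ~0.1353, ~0.0067, ~4.54e-05
}
```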
I think that you should be using torch::softmax() instead of torch::exp().
Softmax normalizes each exp(output_i), i.e., exp(output_i) / (exp(output_0) + ... + exp(output_n)).
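A small sketch of what I mean (the logits tensor is dummy data): applied to log-probabilities, torch::softmax() gives the same result as torch::exp(), since the exponentials already sum to 1, but softmax() is the safer choice if you are not sure the input is already normalized:

```cpp
#include <torch/torch.h>
#include <iostream>

int main() {
    // Dummy logits; in the real model, output comes from log_softmax.
    auto logits    = torch::randn({1, 10});
    auto log_probs = torch::log_softmax(logits, /*dim=*/1);

    // softmax divides each exp(output_i) by the sum over all classes.
    auto via_softmax = torch::softmax(log_probs, /*dim=*/1);
    auto via_exp     = torch::exp(log_probs);

    // The two agree here because exp(log_probs) already sums to 1,
    // but softmax() also works on raw, un-normalized scores.
    std::cout << (via_softmax - via_exp).abs().max() << std::endl;
}
```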