Using nn.CrossEntropyLoss(), how can I get softmax output?

As far as I know, nn.CrossEntropyLoss() automatically applies log-softmax to the FC layer output.
So then, how can I get the log-softmax/softmax output?

Thank you.

The outputs would be the raw logits; you could simply apply a softmax to the output of a forward pass. Something like:

import torch.nn as nn

model = nn.Sequential(...)   # your model, ending in a layer that outputs logits
probs = nn.Softmax(dim=1)    # softmax over the class dimension

outputs = model(input)       # raw logits
probs(outputs)               # probabilities
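For example, with a dummy batch (the layer sizes and shapes here are just assumptions for illustration), each row of the softmax output sums to 1:

import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 4))  # toy classifier, sizes are assumptions
probs = nn.Softmax(dim=1)

inputs = torch.randn(2, 10)              # dummy batch of 2 samples
outputs = model(inputs)                  # raw logits, shape (2, 4)
print(probs(outputs).sum(dim=1))         # each row sums to 1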

Yeah, that’s one way to get the softmax output.
But there is a problem.
I want to use this loss function.
criterion = nn.CrossEntropyLoss().cuda()

outputs = model(input)
softmax_output = probs(outputs)
loss = criterion(softmax_output, labels)  # ??

Then the loss is effectively nn.LogSoftmax() applied on top of nn.Softmax()(outputs), right?
Because nn.CrossEntropyLoss() will apply log-softmax itself.
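For instance, a quick check with made-up logits (the numbers are only for illustration) shows that feeding the softmax output into nn.CrossEntropyLoss() changes the loss:

import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()
probs = nn.Softmax(dim=1)

logits = torch.tensor([[2.0, -1.0, 0.5]])  # one sample, three classes
labels = torch.tensor([0])

print(criterion(logits, labels))           # log-softmax applied once (correct)
print(criterion(probs(logits), labels))    # softmax, then log-softmax again (wrong)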
What should I do?

Thank you.

From the official docs https://pytorch.org/docs/stable/nn.html:
nn.CrossEntropyLoss() combines nn.LogSoftmax() and nn.NLLLoss() in one single class.

Thank you.
But the link is dead.
Can I separate nn.CrossEntropyLoss() into its nn.LogSoftmax() and nn.NLLLoss() parts?

Code says more than words :wink:

probs = nn.Softmax(dim=1)          # or nn.LogSoftmax(dim=1)
criterion = nn.CrossEntropyLoss()

outputs = model(inputs)            # raw logits
softmax_output = probs(outputs)    # probabilities, only if you want to inspect them
loss = criterion(outputs, labels)  # note: the raw logits go into the criterion
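And if you really do want the separated version, a minimal sketch (assuming the same model, inputs, and labels as above) that should give the same loss value as nn.CrossEntropyLoss() on the raw logits:

import torch.nn as nn

log_probs = nn.LogSoftmax(dim=1)
criterion = nn.NLLLoss()

outputs = model(inputs)                       # raw logits
loss = criterion(log_probs(outputs), labels)  # equivalent to CrossEntropyLoss(outputs, labels)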

Thank you for your help ^^.
