nn.Softmax returns very small value

I’m trying to replicate the GNNExplainer.
When evaluating a trained GNN model inside the GNNExplainer with a subset of the original training dataset, some of the nodes return predictions as shown below. When I apply a softmax to the prediction, it returns a list of very small values, which I won’t be able to use to calculate the loss…

model_ypred: tensor([-2.0499, -0.7565, 5.0231, -1.4513], grad_fn=)
model_ypred(softmax): tensor([8.4303e-04, 3.0732e-03, 9.9455e-01, 1.5340e-03],
grad_fn=)

Sorry, I’m very new to GNNs and the softmax function… I would appreciate it if anyone could give me some advice on the softmax function. Or are the prediction values being returned incorrectly?

Hello Sean!

These Softmax values are correct.

Softmax converts raw-score logits, which run from -inf to inf,
into probabilities that run from 0 to 1. Note that your
“model_ypred(softmax)” values sum to 1, as proper probabilities should.

The [2] element of “model_ypred(softmax)” is 99.46%, very nearly
100%, and corresponds to the largest (and only non-negative) value
in “model_ypred”.

Whether the results of Softmax should be understood as “large” or
“small” depends on what you do with them.
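Since you mention using these values to calculate a loss: rather than taking log() of near-zero softmax outputs, the numerically stable route is to work with log-softmax directly (this is what PyTorch’s nn.CrossEntropyLoss does internally on raw logits). A minimal plain-Python sketch, using your logits and assuming, for illustration, that class 2 is the true label:

```python
import math

logits = [-2.0499, -0.7565, 5.0231, -1.4513]
target = 2  # index of the assumed true class (for illustration only)

# log_softmax(x)_i = x_i - logsumexp(x). Computing the loss this way
# avoids ever taking the log of a tiny softmax probability.
m = max(logits)
logsumexp = m + math.log(sum(math.exp(x - m) for x in logits))
log_probs = [x - logsumexp for x in logits]

nll = -log_probs[target]  # cross-entropy loss for this one prediction
print(nll)  # ~0.00546, i.e. -log(0.99455) -- small because the model is confident
```

In practice you would pass the raw logits (not the softmax output) straight to nn.CrossEntropyLoss.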

The specific formula is given in the Softmax documentation: softmax(x)_i = exp(x_i) / sum_j exp(x_j).
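You can verify that this formula reproduces your values with a few lines of plain Python (no torch needed, using the logits from your post):

```python
import math

# Logits from "model_ypred" in the question.
logits = [-2.0499, -0.7565, 5.0231, -1.4513]

# softmax(x)_i = exp(x_i) / sum_j exp(x_j).
# Subtracting max(x) first is the standard numerical-stability trick;
# it does not change the result.
m = max(logits)
exps = [math.exp(x - m) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

print(probs)       # ~[8.43e-04, 3.07e-03, 9.946e-01, 1.53e-03]
print(sum(probs))  # 1.0 (up to floating-point rounding)
```

The [2] entry dominates because exp() amplifies the gap between the one large logit and the three negative ones.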

Best.

K. Frank

AHHH… There’s actually no error… 9.9455e-01 is 99.45%, which is the correct value.
My mistake. Thank you Frank.