Softmax Output Values Do Not Sum to 1 in PyTorch

I’m running into an issue when applying the softmax function to a tensor in PyTorch: the returned values do not sum to 1 as I expected. Below is the code I am using:

import torch

torch.set_printoptions(precision=12, sci_mode=False)

a = torch.tensor([15.5438404083251953125000000, -7.4692978858947753906250000, -7.7074594497680664062500000])

soft = torch.nn.functional.softmax(a, dim=0)
print(soft)
# Output:
# tensor([1.000000000000, 0.000000000101, 0.000000000080])

The softmax output is a tensor whose values are very close to 0 and 1, but they do not sum to exactly 1. I understand this is likely due to floating-point precision, but I’m not sure how to resolve it.
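For reference, the discrepancy can be checked by summing the output tensor directly (this check is not part of the snippet above):

print(soft.sum())               # total of the three probabilities
print((soft.sum() - 1).abs())   # how far the total is from exactly 1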

  1. How can I ensure the softmax output values sum to 1?
  2. Is there a more stable or accurate way to apply softmax in this context to avoid precision issues?

Hello!
The softmax values are computed and stored in 32-bit floating point, so each entry is rounded to the nearest representable number, and the sum of those rounded values can differ from 1 by an error on the order of machine epsilon. This is expected floating-point behavior rather than a bug, and an exact sum of 1 cannot be guaranteed in floating-point arithmetic in general. If the discrepancy matters for your use case, you have a few options: convert the tensor to double precision before applying softmax, work in log space with torch.nn.functional.log_softmax (and exponentiate only at the end, if at all), or renormalize the softmax output by dividing it by its sum.
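Here is a minimal sketch of those three options, reusing the example tensor from the question (the printed values may vary slightly with hardware and PyTorch version):

import torch

torch.set_printoptions(precision=12, sci_mode=False)

a = torch.tensor([15.5438404083, -7.4692978859, -7.7074594498])

# Option 1: compute softmax in double precision to reduce rounding error.
soft64 = torch.nn.functional.softmax(a.double(), dim=0)
print(soft64, soft64.sum())

# Option 2: stay in log space with log_softmax and exponentiate only at the
# end (or pass the log-probabilities directly to a loss such as NLLLoss).
log_soft = torch.nn.functional.log_softmax(a, dim=0)
print(log_soft.exp(), log_soft.exp().sum())

# Option 3: renormalize the float32 softmax output so its sum is as close to
# 1 as float32 rounding allows (an exact sum is still not guaranteed).
soft = torch.nn.functional.softmax(a, dim=0)
soft = soft / soft.sum()
print(soft, soft.sum())

If the probabilities are only needed to feed a loss function, log_softmax combined with NLLLoss (or cross_entropy applied to the raw logits) is usually the most stable route, since it never has to form the probabilities explicitly.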