Softmax assigns equal values to all classes

Hi, I am new to PyTorch. I am facing an issue where, when I apply softmax to the network's outputs, all classes are assigned the same probability. For example, for a 9-class problem, the output for each class is 0.111111. This results in a constant cross-entropy loss, no matter what the input is.
However, when I take the argmax of these same outputs, the predicted classes are correct and the network reaches more than 90% accuracy. The individual predicted values are also very small.
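Here is roughly what I am seeing (the values below are made up for illustration; my real outputs come from the network):

import torch

# nine made-up outputs from the final layer of the network
logits = torch.tensor([0.00012, 0.00034, 0.00051, 0.00029, 0.00096,
                       0.00018, 0.00044, 0.00061, 0.00007])

probs = torch.softmax(logits, dim=0)
print(probs)           # every class prints as 0.1111
print(probs.argmax())  # yet argmax still picks a single class (index 4)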

Hello Hamza!

This sounds odd and is well worth investigating.

Could you run your system and print out the values before they go
into softmax()? Then make a short, self-contained, runnable script
that uses those values to illustrate your issue.

I have in mind something like this:

import torch
print(torch.__version__)

# the nine pre-softmax values printed from the network
my_values = torch.FloatTensor(<my_nine_values_that_I_printed_out>)

# smallest and largest raw values, and the spread between them
my_min, my_argmin = my_values.min(dim=0)
print(my_min, my_argmin)
my_max, my_argmax = my_values.max(dim=0)
print(my_max, my_argmax)
print(my_max - my_min)

# the same statistics after applying softmax
my_values = torch.nn.functional.softmax(my_values, dim=0)
my_min, my_argmin = my_values.min(dim=0)
print(my_min, my_argmin)
my_max, my_argmax = my_values.max(dim=0)
print(my_max, my_argmax)
print(my_max - my_min)

Best.

K. Frank

Thanks, K. Frank. It turns out the problem was the default print precision for tensors. When I print the values using .item(), they are indeed different, but only minutely. The predicted probability values are also very small.
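For anyone who hits the same thing, a small sketch of what was going on (the values below are made up for illustration):

import torch

# nine nearly identical softmax outputs
probs = torch.tensor([0.111102, 0.111126, 0.111145, 0.111121, 0.111196,
                      0.111109, 0.111138, 0.111152, 0.111091])

print(probs)               # default print precision: every entry shows as 0.1111
print(probs[4].item())     # .item() reveals the full value of a single entry
torch.set_printoptions(precision=6)
print(probs)               # with more digits, the small differences are visible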