Softmax calculation is slightly off

I am using cross entropy loss on two computers, and one of them returns different numbers. I believe I have tracked it down to one of the computers returning slightly wrong values for softmax. Any reason why this would be? The computer with the "wrong" values, whose numbers are shown here, is running PyTorch 1.3.1.
(Pdb) (X[0].exp() / X[0].exp().sum()).tolist()
[0.09599629789590836, 0.06680507212877274, 0.09601865708827972, 0.06676748394966125, 0.12830372154712677, 0.06672990322113037, 0.13807427883148193, 0.06669234484434128, 0.16462643444538116, 0.10998577624559402]
(Pdb) F.softmax(X[0]).tolist()
[0.09599630534648895, 0.06680507957935333, 0.09601867198944092, 0.06676748394966125, 0.12830372154712677, 0.06672991067171097, 0.13807427883148193, 0.06669234484434128, 0.16462644934654236, 0.10998578369617462]
(Pdb) X[0].softmax(0).tolist()
[0.09599630534648895, 0.06680507957935333, 0.09601867198944092, 0.06676748394966125, 0.12830372154712677, 0.06672991067171097, 0.13807427883148193, 0.06669234484434128, 0.16462644934654236, 0.10998578369617462]

Hi py!

This is floating-point round-off error. First cast X to a dtype = torch.double tensor, and then redo your test. Your "right" numbers and "wrong" numbers will now agree to about twice as many decimal places.
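
A minimal sketch of that check (the tensor x here is an illustrative stand-in for your X[0]; substitute your own float32 logits):

import torch
import torch.nn.functional as F

# Illustrative float32 logits standing in for X[0].
x = torch.randn(10)

# In float32, the manual exp/sum and the built-in softmax can disagree
# in the last few printed decimal places.
manual_f32 = (x.exp() / x.exp().sum()).tolist()
builtin_f32 = F.softmax(x, dim=0).tolist()

# Cast to double precision and redo the same test.
x64 = x.double()
manual_f64 = (x64.exp() / x64.exp().sum()).tolist()
builtin_f64 = F.softmax(x64, dim=0).tolist()

# The double-precision results agree to roughly twice as many decimal places.
for m, b in zip(manual_f64, builtin_f64):
    print(f"{m:.17f}  {b:.17f}")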

Best.

K. Frank
