Is it possible that F.softmax returns slightly different outputs even though the inputs are the same?
And if each element of input1 and input2 differs by roughly 0.00001, will the softmax outputs differ noticeably?
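A minimal sketch of the comparison I have in mind (the shapes, seed, and perturbation size are just placeholders):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Case 1: bitwise-identical inputs.
input1 = torch.randn(4, 10)
input2 = input1.clone()
out1 = F.softmax(input1, dim=1)
out2 = F.softmax(input2, dim=1)
print(torch.equal(out1, out2))        # True if the same deterministic path is used
print((out1 - out2).abs().max())      # 0.0 if bitwise identical

# Case 2: inputs perturbed by ~1e-5.
input3 = input1 + 1e-5 * torch.randn_like(input1)
out3 = F.softmax(input3, dim=1)
print((out1 - out3).abs().max())      # small difference, on a similar order of magnitude
```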
You might see relatively small errors due to the limited floating point precision if the used algorithm is non-deterministic.
Could you set `torch.set_deterministic(True)` and rerun the script? Also, which PyTorch version are you using, and are you running this test on the CPU or GPU?
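A rough sketch of the check I mean (the exact API depends on your version: `torch.set_deterministic` was later renamed to `torch.use_deterministic_algorithms`):

```python
import torch
import torch.nn.functional as F

# Opt into deterministic algorithms. On PyTorch 1.7 this is
# torch.set_deterministic(True); on newer releases use the call below.
# Some CUDA ops additionally require CUBLAS_WORKSPACE_CONFIG=":4096:8"
# to be set in the environment.
torch.use_deterministic_algorithms(True)
torch.manual_seed(0)

device = "cuda" if torch.cuda.is_available() else "cpu"
x = torch.randn(4, 10, device=device)

out1 = F.softmax(x, dim=1)
out2 = F.softmax(x.clone(), dim=1)
print(torch.equal(out1, out2))        # expected True once non-determinism is ruled out
print((out1 - out2).abs().max())
```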