Slight difference in outputs of a model for the same seed/inputs


Is it normal that if I run the same model twice after a couple of backprop steps with the same inputs on CPU (i.e. same seeds, etc.), the outputs of each run are slightly different, around the fourth digit after the decimal point (e.g. 2.13971 vs 2.13968)?
[PyTorch version is 1.0.1.post2 and Python 3.7.1 [GCC 7.3.0], Anaconda custom (64-bit) on Linux]


Aren’t you training your model during backprop, resulting in different weight values? Or are you referring to something else?

Check this post:

Setting the number of threads will do the trick: torch.set_num_threads(1)
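For reference, here is a minimal sketch of how this might look in practice. The toy model, shapes, and training loop are my own assumptions, not the original poster's code; the point is only that with a single CPU thread and the same seed, two full runs should produce bit-identical outputs:

```python
import torch
import torch.nn as nn

def run_once():
    # Single thread => floating-point reductions happen in a fixed order
    torch.set_num_threads(1)
    # Same seed => same weight init and same random inputs every run
    torch.manual_seed(0)

    model = nn.Linear(10, 1)  # toy model (assumption, not the OP's model)
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    x = torch.randn(32, 10)
    y = torch.randn(32, 1)

    # "a couple of backprop" steps, as in the question
    for _ in range(5):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(x), y)
        loss.backward()
        opt.step()

    return model(x).sum().item()

print(run_once() == run_once())  # expected to print True
```

Without torch.set_num_threads(1), multi-threaded reductions can sum terms in different orders between runs, which is enough to perturb the last few decimal digits even though nothing is "wrong" numerically.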