How random is AdamOptimizer

After 5 iterations, the backward results differ even with the same input.

I don’t think Adam adds randomness. Could you describe your issue with more details, please?

I use manual_seed_all to make all random number generation in the net constant, so the values initialized on each run are the same.
The input dataset and the initial values of the AdamOptimizer variables are also the same, but after 5 or 10 iterations the values no longer match, including the losses, conv weights, and gradients.
How can that be?

Take a look at the Reproducibility docs which explain how deterministic behavior can be achieved.
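As a minimal sketch of what those docs recommend (the helper function and the toy model here are my own, just for illustration): seed every RNG before building the model, opt in to deterministic algorithms, and disable cuDNN autotuning. With that in place, two runs with the same seed should produce bit-identical losses.

```python
import torch

def train_step_deterministic(seed: int) -> float:
    # Seed every RNG PyTorch uses, before any tensors are created.
    torch.manual_seed(seed)           # CPU and current CUDA device
    torch.cuda.manual_seed_all(seed)  # all CUDA devices (no-op on CPU-only)

    # Error out instead of silently using non-deterministic kernels.
    torch.use_deterministic_algorithms(True)
    # Autotuning picks kernels by benchmarking, which can vary run to run.
    torch.backends.cudnn.benchmark = False

    # Toy model/data purely for demonstration.
    model = torch.nn.Linear(4, 1)
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    x = torch.randn(8, 4)
    y = torch.randn(8, 1)

    for _ in range(10):
        opt.zero_grad()
        loss = torch.nn.functional.mse_loss(model(x), y)
        loss.backward()
        opt.step()
    return loss.item()
```

Note that on CUDA the docs also mention setting the `CUBLAS_WORKSPACE_CONFIG` environment variable, and that determinism is only guaranteed for the same software/hardware configuration, not across different GPUs or PyTorch versions.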