I’m running some experiments and found that even though I fixed the initial weights (I loaded pre-initialized weights), the learning rate, and the data order (I set the shuffle option to False), the gradients still differed when I ran the same training twice! I recorded my model’s gradients over 5 epochs, repeated the run a second time, and compared the two sets of gradients against each other. They drift further and further apart as the epochs go on.
Is there anything else that could cause the gradients to change between runs?
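For reference, here is a minimal, framework-free sketch (plain Python, illustrative values only) of one reason two runs with identical weights, learning rate, and data order can still drift: floating-point addition is not associative, so any operation whose internal summation order can vary between runs (e.g., parallel reductions or atomic adds in GPU kernels) produces a tiny difference, and that difference compounds with every weight update.

```python
# Floating-point addition is not associative: the same three terms
# summed in a different order give slightly different results.
a = (0.1 + 0.2) + 0.3   # summed left to right
b = 0.1 + (0.2 + 0.3)   # same terms, different grouping
print(a == b)           # False
print(abs(a - b))       # a tiny difference that can compound over many updates
```

A difference this small in one gradient step is invisible at first, but after enough updates the two runs follow measurably different trajectories, which matches the pattern of gradients diverging more as the epoch count grows.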