I just wrote a simple model to classify CIFAR-10, following this tutorial: https://pytorch.org/tutorials/beginner/transfer_learning_tutorial.html#sphx-glr-beginner-transfer-learning-tutorial-py
I ran it twice with the same seed:
torch.manual_seed(60)
torch.cuda.manual_seed(60)
I also set the dataset loader's shuffle=False and used no transform that introduces randomness.
Besides, my network's parameters are loaded from an existing weight file in order to avoid random initialization.
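For reference, my seeding setup looks roughly like the sketch below (`seed_everything` is just a name I'm using here for illustration; the cuDNN flags are the extra settings I've seen recommended for determinism, not something from the tutorial):

```python
import random

import numpy as np
import torch


def seed_everything(seed: int = 60) -> None:
    """Seed every RNG source I'm aware of: Python, NumPy, and PyTorch (CPU and all GPUs)."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    # cuDNN may pick non-deterministic kernels by default; these flags pin it down.
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False


seed_everything(60)
```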
But during training, as the epochs increased, the differences between the network weights of the two runs became more and more obvious (after the first epoch the differences were only about 1e-9, after the second about 1e-7, and they kept growing).
Why did that happen? Is it due to floating-point computation error, or are there still some sources of randomness I've overlooked? Thanks in advance.