Does PyTorch change its internal seed during training?

As @ybj14 said, the pseudo-random number generator uses the seed to initialize its internal state and generates all subsequent numbers deterministically from that state.
That doesn’t mean that every “random” number will have the exact same value (which would create a useless random number generator), but that the sequence of random numbers is the same for a given seed.
Have a look at this example:

import torch

# Seed the generator and draw three batches of random numbers.
torch.manual_seed(2809)
print(torch.randn(2))
print(torch.randn(2))
print(torch.randn(2))

# Resetting the seed replays the exact same sequence of three tensors.
torch.manual_seed(2809)
print(torch.randn(2))
print(torch.randn(2))
print(torch.randn(2))

As you can see, torch.randn yields new random numbers on each successive call. After resetting the seed, you’ll get the same sequence of random numbers again.

In your case, each model initialization can be seen as a set of calls to the random number generator, so two initializations will yield different parameters. If you want two models to have exactly the same parameters, set the seed before each initialization or load the state_dict of a reference model.
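
Here is a minimal sketch of both approaches, using nn.Linear as a hypothetical stand-in for your actual model:

import torch
import torch.nn as nn

# Approach 1: reset the seed before each initialization, so both
# models draw the same sequence of random numbers.
torch.manual_seed(2809)
model_a = nn.Linear(4, 2)

torch.manual_seed(2809)
model_b = nn.Linear(4, 2)

print(torch.equal(model_a.weight, model_b.weight))  # True

# Approach 2: copy the parameters from a reference model via its state_dict.
model_c = nn.Linear(4, 2)
model_c.load_state_dict(model_a.state_dict())

print(torch.equal(model_a.weight, model_c.weight))  # True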
