Random initialization of neural networks in PyTorch: how is the seed managed?

Dear PyTorch community,
In PyTorch, how can I tell whether the weights of a neural net are re-initialized with a new random seed at each run? That is, if I run my .py script multiple times, are the generated weights different each time (i.e., different realizations of a random variable from the same distribution, uniform I guess), rather than being drawn from the same seed?

The question may seem a bit strange, but in Matlab the default behavior is to use the same seed at every startup, resulting in the same sequence of random numbers (a very bad choice in my opinion: if I wanted the same sequence for reproducibility, I would set the seed manually… but that's how it is). I am not sure about PyTorch, so I would like confirmation, because I want to run my neural net multiple times (same architecture) and see the effect of different random initializations.

Thanks.

You can simply print any weight (or, e.g., its .double().abs().sum()) and you will see that a different seed is used in each run unless you explicitly seed the code.
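A minimal sketch of what I mean (the nn.Linear layer and its sizes here are just placeholders for whatever model you use):

```python
import torch
import torch.nn as nn

# A freshly constructed layer draws its weights from PyTorch's default
# initialization, which consumes the global RNG state.
layer = nn.Linear(10, 5)

# A cheap "checksum" of the weights: run the script twice and this value
# will differ, confirming that a different seed was used in each run.
print(layer.weight.double().abs().sum())
```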

Indeed, I see what you mean.

Also, alternatively, just check the initial seed at the beginning of your script via print(torch.initial_seed()), which should also differ between runs.
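Something like this (the explicit torch.manual_seed line is just there to show the contrast with reproducible runs):

```python
import torch

# torch.initial_seed() returns the seed used to initialize the default RNG.
# Without explicit seeding, PyTorch picks this seed non-deterministically,
# so the printed value changes from run to run.
print(torch.initial_seed())

# For comparison: uncommenting this would make every run reproducible,
# and torch.initial_seed() would then always report 42.
# torch.manual_seed(42)
```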
