Dear PyTorch community,
In PyTorch, how can I tell whether the weights of a neural net are initialized with a fresh random seed on each run? That is, if I run my .py script multiple times, are the generated weights different each time (different realizations of a random variable from the same distribution, uniform I guess), i.e. not drawn from the same seed?
The question may seem a bit strange, but in Matlab the default behavior is to reuse the same seed, so every run produces the same sequence of random numbers (a very bad default in my opinion: if I wanted the same sequence for reproducibility, I would set the seed manually… but that's how it is). I am not sure what PyTorch does, so I would like confirmation, because I want to run my neural net multiple times (same architecture) and see the effect of different random initializations.
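For context, here is a small sketch of what I mean (assuming a standard `torch` install): within one script, two layers built back to back get different weights, while resetting the seed with `torch.manual_seed` reproduces the same ones. Printing a few weight values and running the script twice would then show whether the seed is also fresh across runs.

```python
import torch
import torch.nn as nn

# Two layers constructed one after the other draw different random
# numbers from the global RNG, so their weights differ.
a = nn.Linear(4, 4)
b = nn.Linear(4, 4)
print(torch.equal(a.weight, b.weight))  # False: fresh draws each time

# Fixing the seed before construction makes the init reproducible.
torch.manual_seed(0)
c = nn.Linear(4, 4)
torch.manual_seed(0)
d = nn.Linear(4, 4)
print(torch.equal(c.weight, d.weight))  # True: same seed, same weights

# Run the script twice and compare this printout: if the values change
# between runs, the default seed is not fixed.
print(c.weight[0, :2])
```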