But I still cannot understand why this code should exist. Where are these generated random numbers used? It seems that `opt.manualSeed` is never used later on.
You just need to call torch.manual_seed(seed), and it will set the seed of the random number generator to a fixed value, so that when you call, for example, torch.rand(2), the results will be reproducible.
An example
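A minimal sketch of what that looks like (the exact tensor values depend on your PyTorch version, so they are not shown):

```python
import torch

torch.manual_seed(0)
a = torch.rand(2)         # "random", but fully determined by the seed

torch.manual_seed(0)      # reset the generator to the same state
b = torch.rand(2)

print(torch.equal(a, b))  # True: same seed, same sequence
```

Running the script twice (or re-seeding, as above) always yields the same draws.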
Hey @fmassa, in the above code, what is the difference between random.seed(opt.manualSeed) and torch.manual_seed(opt.manualSeed)? It seems that both of them set seeds. Are they redundant? Thanks!
Yes, I got that point. Can the code use torch.manual_seed(opt.manualSeed) directly and on its own? What is the significance of random.seed(opt.manualSeed)? Is it redundant? @smth
No.
The random values will still be "random", but in a defined order.
I.e. if you restart your script, the same random numbers will be generated.
Have a look at PRNGs (pseudo-random number generators) for more information.
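A quick sketch of why the two calls are not redundant: random.seed and torch.manual_seed control independent generators (Python's random module vs. PyTorch's), so re-seeding one has no effect on the other:

```python
import random
import torch

random.seed(0)
torch.manual_seed(0)
py_first = random.random()
torch_first = torch.rand(2)

# Re-seed only Python's random module:
random.seed(0)
print(random.random() == py_first)              # True: Python's stream restarted
print(torch.equal(torch.rand(2), torch_first))  # False: torch's stream moved on

# Only torch.manual_seed restarts PyTorch's stream:
torch.manual_seed(0)
print(torch.equal(torch.rand(2), torch_first))  # True
```

So a script that uses both libraries needs both calls for full reproducibility.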
Thus, to seed everything, assuming one is using PyTorch and NumPy:
import numpy
import torch

def random_seeding(seed_value, use_cuda):
    numpy.random.seed(seed_value)               # NumPy (CPU) PRNG
    torch.manual_seed(seed_value)               # PyTorch CPU PRNG
    if use_cuda:
        torch.cuda.manual_seed_all(seed_value)  # PyTorch PRNGs on all GPUs

# e.g.: random_seeding(seed_value, use_cuda=torch.cuda.is_available())
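One caveat: that helper seeds NumPy and PyTorch but not Python's built-in random module, which, as discussed above, is a separate generator. A sketch that seeds all three (the function name seed_everything is just illustrative):

```python
import random
import numpy
import torch

def seed_everything(seed_value, use_cuda):
    random.seed(seed_value)                     # Python's random module
    numpy.random.seed(seed_value)               # NumPy PRNG
    torch.manual_seed(seed_value)               # PyTorch CPU PRNG
    if use_cuda:
        torch.cuda.manual_seed_all(seed_value)  # PyTorch PRNGs on all GPUs

seed_everything(42, torch.cuda.is_available())
```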
Question about this: I have about 10 files, all of which include import torch. Do I need to call torch.manual_seed(0) in all of them, or only in the first?