The Random Seed

Today I use this method to set the random seed. args.seed defaults to 1:

torch.manual_seed(args.seed)
if args.cuda:
    torch.cuda.manual_seed(args.seed)

But every time the result is slightly different. Why?
I use PyTorch 0.4.


Are you using the GPU with cuDNN?
If so, you could set the cuDNN behavior to be deterministic, which would unfortunately trade performance for determinism.

torch.backends.cudnn.deterministic = True

Also, are you using any other libraries that sample random numbers?
If so, you should seed them as well.
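Putting these suggestions together, a minimal sketch might look like the following. The helper name `seed_everything` is my own; the calls to `random`, NumPy, and PyTorch seeding are the usual ones, and the cuDNN flags trade speed for reproducibility as described above:

```python
import random
import numpy as np
import torch

def seed_everything(seed: int = 1) -> None:
    # Seed every RNG that could affect results.
    random.seed(seed)          # Python's built-in RNG
    np.random.seed(seed)       # NumPy's global RNG
    torch.manual_seed(seed)    # PyTorch CPU RNG
    if torch.cuda.is_available():
        torch.cuda.manual_seed_all(seed)  # all GPU RNGs
        # Force deterministic cuDNN kernels (slower, but reproducible).
        torch.backends.cudnn.deterministic = True
        torch.backends.cudnn.benchmark = False

# Seeding twice with the same value should reproduce the same samples.
seed_everything(1)
a = torch.randn(3)
seed_everything(1)
b = torch.randn(3)
```

With this in place, `a` and `b` are identical tensors; if your runs still differ, the nondeterminism is coming from somewhere this does not cover (e.g. multi-threaded data loading or nondeterministic CUDA ops).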


No, I use none of them.
If I haven't written

import cudnn

does that mean I'm not using the cuDNN module? I only use torch and torchvision.

Does Xavier initialization matter? I use it this way to initialize my network.

The initializations should yield the same random numbers if the seed was set.
To check whether you are using cuDNN, run print(torch.backends.cudnn.enabled).
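To illustrate the point about initialization: with the seed set before each run, Xavier initialization draws the same numbers every time. The helper below is a sketch of my own, using a small `nn.Linear` layer as a stand-in for the network in question:

```python
import torch
import torch.nn as nn

def xavier_weights(seed: int) -> torch.Tensor:
    # Re-seed, build a fresh layer, and apply Xavier (Glorot) uniform init.
    torch.manual_seed(seed)
    layer = nn.Linear(4, 4)
    nn.init.xavier_uniform_(layer.weight)
    return layer.weight.detach().clone()

# Two runs with the same seed produce identical weights.
w1 = xavier_weights(1)
w2 = xavier_weights(1)
```

So Xavier initialization itself cannot explain run-to-run differences once the seed is fixed; the variation must come from elsewhere.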

Why can’t you write meaningful titles? It makes everyone’s life harder. Please write meaningful titles.