Today I use this method to set the random seed; `args.seed` defaults to 1:

torch.manual_seed(args.seed)
if args.cuda:
    torch.cuda.manual_seed(args.seed)
But every time the result is slightly different. Why? I'm using PyTorch 0.4.
Are you using the GPU with cuDNN?
If so, you could set the cuDNN behavior to be deterministic, which would unfortunately trade performance for determinism.
torch.backends.cudnn.deterministic = True
Also, are you using any other libraries that sample random numbers? If so, you should seed them as well.
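A minimal sketch of seeding all the common RNG sources at once, combining the seeding calls from the question with the cuDNN flag mentioned above. The helper name `set_seed` and the inclusion of Python's and NumPy's RNGs are my additions, assuming a typical training script; adapt as needed.

```python
import random

import numpy as np
import torch


def set_seed(seed=1):
    """Seed every common RNG source for (more) reproducible runs."""
    random.seed(seed)        # Python's built-in RNG
    np.random.seed(seed)     # NumPy RNG, often used in data pipelines
    torch.manual_seed(seed)  # PyTorch CPU RNG
    if torch.cuda.is_available():
        torch.cuda.manual_seed_all(seed)  # seed all visible GPUs
    # Trade performance for determinism in cuDNN-backed ops
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False
```

With this in place, two runs started from the same seed should draw identical random numbers, e.g. `set_seed(1); torch.randn(3)` returns the same tensor each time.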
No, I use none of them. I haven't written `import cudnn`, so I assumed cuDNN isn't being used; I only use torch and torchvision.
Does Xavier initialization matter? That's how I initialize my network.
The initializations should yield the same random numbers, if the seed was set.
To see if you are using cuDNN, use print(torch.backends.cudnn.enabled).
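To illustrate the point about initializations: with the seed set, Xavier initialization draws the same values every time, so it should not be the source of the variation. A small sketch (layer sizes are arbitrary):

```python
import torch
import torch.nn as nn

# First run: seed, then Xavier-initialize a layer
torch.manual_seed(1)
layer_a = nn.Linear(4, 4)
nn.init.xavier_uniform_(layer_a.weight)

# Second run: same seed, same initialization
torch.manual_seed(1)
layer_b = nn.Linear(4, 4)
nn.init.xavier_uniform_(layer_b.weight)

# Identical seeds yield identical Xavier-initialized weights
print(torch.equal(layer_a.weight, layer_b.weight))  # True
```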
Why can’t you write meaningful titles? It makes everyone’s life harder. Please write meaningful titles.