Random seed that spans across devices

Sorry for throwing in multiple questions:

Does pytorch support setting same seed across all devices (CPUs/GPUs)? Do these devices all use the same PRNG? If not, are there any other ways to make layers like dropouts deterministic?

torch.manual_seed(seed) will set the same seed on all devices (the CPU and every CUDA device).

However, setting the same seed on CPU and on GPU doesn’t mean they produce the same random number sequences: the two device types use different PRNG algorithms (for efficiency reasons).
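A minimal sketch of both points: seeding makes a given device reproducible, but the same seed need not produce the same numbers on CPU and GPU. (The seed value 0 and the tensor shapes here are arbitrary choices for illustration.)

```python
import torch

# Same seed, same device -> same sequence (reproducibility on one device).
torch.manual_seed(0)
a = torch.rand(3)
torch.manual_seed(0)
b = torch.rand(3)
print(torch.equal(a, b))  # True

# CPU and CUDA use different PRNG algorithms, so the same seed
# generally does NOT yield the same values across device types.
if torch.cuda.is_available():
    torch.manual_seed(0)
    g = torch.rand(3, device="cuda")
    print(torch.equal(a, g.cpu()))  # typically False
```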

As guidance: treat PyTorch's seeding as a determinism helper within a device type, not as a guarantee of identical random sequences across device types.


Can I get deterministic results for layers like dropout just on GPUs with Pytorch?

Yes: if you fix the seed, dropout will be deterministic in PyTorch, even on GPUs.
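A small sketch of this: with the seed re-fixed before each forward pass, a dropout layer samples the same mask both times. (The seed 42, dropout probability 0.5, and tensor shape are arbitrary; the same pattern applies on a CUDA device by moving the input there.)

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)  # module starts in training mode, so dropout is active
x = torch.ones(8)

torch.manual_seed(42)
out1 = drop(x)
torch.manual_seed(42)
out2 = drop(x)

# Identical seed -> identical dropout mask -> identical outputs.
print(torch.equal(out1, out2))  # True
```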

How do I go about using the same PRNG on different PyTorch devices? Can I switch the backend for different layers? Also, which backends are available in PyTorch for nn layers: cuDNN, MKL, torch (CPU), and torch (GPU)?
