Difference between torch.manual_seed and torch.cuda.manual_seed

I often use torch.manual_seed in my code, and I also set the same seed for numpy and Python's built-in random module.

But I noticed that there is also torch.cuda.manual_seed. I only use a single GPU.
So what happens if I do not set torch.cuda.manual_seed? For example, torch.randn returns the same values even without torch.cuda.manual_seed. So I want to know in what situations I should use cuda's manual_seed.

So would the following code be better?

torch.manual_seed(args.seed)
torch.cuda.manual_seed(args.seed)
np.random.seed(args.seed)
random.seed(args.seed)

The cuda manual seed should be set if you want to have reproducible results when using random generation on the gpu, for example if you do torch.cuda.FloatTensor(100).uniform_().
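To make that concrete, here is a minimal sketch (it assumes a CUDA device is available; without one, the guarded block is simply skipped) showing that seeding the CUDA generator makes GPU-side random generation reproducible:

```python
import torch

# Sketch: re-seeding the CUDA RNG before each draw yields identical
# GPU-side samples. Guarded so it is a no-op on CPU-only machines.
if torch.cuda.is_available():
    torch.cuda.manual_seed(0)
    a = torch.cuda.FloatTensor(100).uniform_()

    torch.cuda.manual_seed(0)
    b = torch.cuda.FloatTensor(100).uniform_()

    # same seed before each draw -> same samples
    assert torch.equal(a, b)
```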


Thank you!
I don't usually write code like that, but I'll call the seed function anyway. :smiley:

Hey, but shouldn't torch.manual_seed take care of both, as written in https://pytorch.org/docs/stable/notes/randomness.html?

You can use torch.manual_seed() to seed the RNG for all devices (both CPU and CUDA)


Yes, the behavior was changed some time ago and was most likely different when @albanD answered in this thread. :wink:


I was just wondering best practice for using seeding. I’m using

torch.manual_seed(args.seed)
torch.cuda.manual_seed(args.seed)
np.random.seed(args.seed)
random.seed(args.seed)

for running experiments on a new loss function, once with the new loss and once with the standard loss. Is it better to keep using one specific seed value or to vary the seed? I'm wondering whether some seeds might produce a better initialisation and therefore reach a better solution, in the spirit of "all you need is a good init"…

I'm training two models simultaneously in the same script, so should I place the seed lines above immediately before instantiating each model individually, to ensure the same initialisation and a fair comparison of one loss against the other?
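For what it's worth, re-seeding right before each constructor does give both models identical starting weights. A minimal sketch (the `nn.Linear` layers stand in for whatever models are actually being compared):

```python
import torch
import torch.nn as nn

SEED = 0  # hypothetical seed value

# Re-seed immediately before instantiating each model so both start
# from the same random initialisation.
torch.manual_seed(SEED)
model_a = nn.Linear(10, 2)  # e.g. trained with the standard loss

torch.manual_seed(SEED)
model_b = nn.Linear(10, 2)  # e.g. trained with the new loss

# Both models begin with identical parameters.
for p_a, p_b in zip(model_a.parameters(), model_b.parameters()):
    assert torch.equal(p_a, p_b)
```

Any difference after training is then attributable to the loss (and to any remaining nondeterminism), not to the initialisation.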

Is it wise using a seed for this type of research in general?

To make your runs reproducible, you can try all of the following:

seed_value = 0

# 1. Set the `PYTHONHASHSEED` environment variable to a fixed value
import os
os.environ['PYTHONHASHSEED'] = str(seed_value)

# 2. Seed Python's built-in pseudo-random generator
import random
random.seed(seed_value)

# 3. Seed numpy's pseudo-random generator
import numpy as np
np.random.seed(seed_value)

# 4. Seed pytorch's pseudo-random generator (CPU and all CUDA devices)
import torch
torch.manual_seed(seed_value)
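The steps above can be collected into a single helper. The two cuDNN flags at the end are an extra measure from the PyTorch reproducibility notes, not part of the snippet above, and they can slow training down:

```python
import os
import random

import numpy as np
import torch


def seed_everything(seed_value: int = 0) -> None:
    """Seed every relevant RNG in one call (sketch)."""
    os.environ['PYTHONHASHSEED'] = str(seed_value)
    random.seed(seed_value)
    np.random.seed(seed_value)
    torch.manual_seed(seed_value)  # seeds CPU and all CUDA devices

    # Extra steps from the PyTorch reproducibility notes: force cuDNN
    # to use deterministic algorithms and disable its autotuner.
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False


seed_everything(0)
```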
