torch.manual_seed needs to be set every time

I am using torch.manual_seed to fix the generator, but it gives different values on each call unless I manually set the seed before every call. One example run is:

import torch

torch.manual_seed(10)
d = torch.randn(4)
e = torch.randn(4)
print(d, e)

torch.manual_seed(10)  # re-seed before the second draw
e = torch.randn(4)
print(d, e)

The output is:
tensor([-0.6014, -1.0122, -0.3023, -1.2277]) tensor([ 0.9198, -0.3485, -0.8692, -0.9582])
tensor([-0.6014, -1.0122, -0.3023, -1.2277]) tensor([-0.6014, -1.0122, -0.3023, -1.2277])


Seeding a random number generator ensures that it returns the same sequence of random numbers.
As you can see in your example, the second e gets the same values as d after resetting the seed.

I'm not sure I understand the issue correctly, but would you like to get exactly the same values for each torch.randn call? This would be an unusual use case.
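
If that is the goal, one option is to save and restore the RNG state around the draws. A minimal sketch using torch.get_rng_state / torch.set_rng_state:

import torch

torch.manual_seed(10)

# Save the CPU RNG state, draw, then rewind the state and draw again.
state = torch.get_rng_state()
d = torch.randn(4)

torch.set_rng_state(state)  # rewind the generator
e = torch.randn(4)

print(torch.equal(d, e))  # True: both draws consumed the same RNG state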

I believe that was the intention, inferring from the context of the thread. Additionally, if we replace the above example with NumPy, we get the same behaviour.

import random
import numpy as np

# Seed the global generators and a dedicated RandomState instance.
random.seed(2019)
np.random.seed(2019)
rng = np.random.RandomState(2019)

# Three consecutive draws from the same seeded generator differ
# from one another, but the whole sequence is reproducible.
d = rng.randn(4)
e = rng.randn(4)
c = rng.randn(4)
print(d, e, c)

Every call to randn will produce a different result, but running the above code as a script over and over again will yield three arrays with different random values (because of the successive randn calls), and those values will be consistent across different runs of the same script.
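
A minimal sketch of that behaviour (the helper three_draws is just for illustration):

import numpy as np

def three_draws(seed):
    # A fresh RandomState replays the same sequence for a given seed.
    rng = np.random.RandomState(seed)
    return rng.randn(4), rng.randn(4), rng.randn(4)

a1, b1, c1 = three_draws(2019)
a2, b2, c2 = three_draws(2019)

print(np.array_equal(a1, b1))  # False: draws within one sequence differ
print(np.array_equal(a1, a2))  # True: re-seeding reproduces the sequence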

Yeah, my use case was to produce the same random numbers at two different parts of a script. I had a wrong understanding of how manual_seed works. Thank you for clearing it up!
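
For that use case, one possible sketch is to seed a local torch.Generator at each point, so the global RNG state is left untouched (fixed_randn is a hypothetical helper; torch.Generator and the generator argument to torch.randn are real APIs):

import torch

def fixed_randn(n, seed=10):
    # Hypothetical helper: a freshly seeded local generator returns
    # the same numbers wherever it is called, without touching the
    # global RNG state.
    g = torch.Generator().manual_seed(seed)
    return torch.randn(n, generator=g)

d = fixed_randn(4)  # early in the script
e = fixed_randn(4)  # much later in the script
print(torch.equal(d, e))  # True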