Weird behavior of torch seed

I don’t think that’s the case, and the seed would still be the same one you’ve set.
However, this behavior is expected: the extra operation calls into the pseudo-random number generator (PRNG), which creates the divergence.
Once seeded, the PRNG will produce the same sequence of random numbers for the same order of operations.
If you break this assumption and add more random calls to your script, the order/sequence of operations changes and you will no longer get the same values.

Here is a small example:

import torch
import torch.nn as nn

torch.manual_seed(2809)
print(torch.randn(3))
# tensor([-2.0748,  0.8152, -1.1281])
print(torch.randn(3))
# tensor([ 0.8386, -0.4471, -0.5538])
print(torch.randn(3))
# tensor([-0.8776, -0.5635,  0.5434])

torch.manual_seed(2809)
print(torch.randn(3))
# tensor([-2.0748,  0.8152, -1.1281])
print(torch.randn(3))
# tensor([ 0.8386, -0.4471, -0.5538])
print(torch.randn(3))
# tensor([-0.8776, -0.5635,  0.5434])

torch.manual_seed(2809)
print(torch.randn(3))
# tensor([-2.0748,  0.8152, -1.1281])

# add new call into PRNG by initializing a new layer
layer = nn.Linear(1, 1)

print(torch.randn(3))
# tensor([0.8386, 0.2195, 1.3188]) # !!!
print(torch.randn(3))
# tensor([-1.6841,  0.9325, -1.8085]) # !!!
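
If you want your sampled values to be robust against such extra calls, one option (a minimal sketch, not part of the original post) is to draw from a dedicated torch.Generator instead of the global PRNG, so that unrelated operations (such as initializing a new layer) don’t consume numbers from your sequence:

import torch
import torch.nn as nn

# dedicated generator: its state is independent of the global PRNG
g = torch.Generator()
g.manual_seed(2809)

print(torch.randn(3, generator=g))

# this layer init still draws from the *global* PRNG, not from g
layer = nn.Linear(1, 1)

# continues the same sequence as before, unaffected by the layer init
print(torch.randn(3, generator=g))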