Is there a RandomState equivalent in PyTorch for "local" random generator seeding?

As far as I’m aware, if I use torch.manual_seed(12), that will seed the PyTorch random number generator globally, so every time I use any Torch function involving random number generation, it will automatically use the set seed. Is it possible to have seeding for one part of my PyTorch code, but then remove it and allow varying random generation in a different part of the code? For example, I’d like to generate the same array each time with a set seed, but then, in a later for-loop, generate different random values on each iteration by not using the seed. I can accomplish this in NumPy by using RandomState to make a local random generator where I want a fixed seed, while keeping the global random generation unseeded.
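For reference, this is roughly the NumPy pattern I mean (the seed value is just an example):

import numpy as np

# Local generator with a fixed seed: same values every run
rng = np.random.RandomState(12)
fixed = rng.rand(5)

# The global generator stays unseeded, so these vary between runs
for _ in range(3):
    varying = np.random.rand(5)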


Just came here to ask exactly this question. This’d be really nice to have. As far as I can tell from the torch.rand documentation, there’s no way to pass a RandomState object as an argument. I found this, but that seems to be just for the Lua version. However, gen = torch.Generator() seems to work; I just don’t know what to do with it.

You could pass your torch.Generator manually to the random function.
I’m not sure if this API will change in the future, as this argument isn’t documented as far as I know and maybe shouldn’t be exposed in the current implementation.

However, I think this code should work:

import torch

# Two independent generators, each with its own seed
gen0 = torch.Generator()
gen1 = torch.Generator()

gen0 = gen0.manual_seed(0)
gen1 = gen1.manual_seed(1)
torch.rand(5, generator=gen0)
torch.rand(5, generator=gen0)
torch.rand(5, generator=gen1)
torch.rand(5, generator=gen1)

# Re-seeding reproduces the same sequences, even if the call order changes
gen0 = gen0.manual_seed(0)
gen1 = gen1.manual_seed(1)
torch.rand(5, generator=gen1)
torch.rand(5, generator=gen1)
torch.rand(5, generator=gen0)
torch.rand(5, generator=gen0)
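
And to match the use case from the question, a rough sketch: keep one seeded generator for the reproducible part and leave the global RNG untouched for the rest (the seed value is arbitrary):

import torch

# Locally seeded generator: reproducible across runs
gen = torch.Generator().manual_seed(12)
fixed = torch.rand(5, generator=gen)

# The default global RNG is unaffected by the local generator,
# so these still vary between runs
for _ in range(3):
    varying = torch.rand(5)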

Oooh nice, that should be documented. Though really it would be nicest if, like in NumPy, you could do rng = torch.Generator(seed=1234); data = rng.randn(3, 4).
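
For comparison, the NumPy pattern I have in mind is roughly this (the seed is just an example):

import numpy as np

rng = np.random.RandomState(1234)  # local generator, seeded at construction
data = rng.randn(3, 4)             # draw directly from the generator object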


Since I’ve seen this in other frameworks, maybe you could create a feature request on GitHub so that we could discuss this potential feature there? :slight_smile:

Good idea, I’ve created an issue here


Just got caught up with all the activity! Thanks for the torch.Generator solution, @ptrblck! And thanks @petered for posting that GitHub feature request; hopefully the devs are able to implement something that matches NumPy. In the meantime, I’m glad something already exists for local seeding.
