Replicate behavior from seeded numpy pcg64 Generator with PyTorch

I have a neuromorphic board which appears to use the same PCG64 Generator that numpy uses as part of the dynamics of the neural network. I can replicate the randomness seen on the board with numpy in a forward sense, assuming I seed the numpy Generator properly.

I’d like to train a network to run on this board using PyTorch’s autograd capability, while consuming the same seeded PCG64 sequence. When I use:

import torch

gen1 = torch.Generator()           # PyTorch's own CPU generator
gen1.manual_seed(seed)
torch.rand(1, generator=gen1)

I get different results than if I do:

import numpy

gen2 = numpy.random.default_rng(seed=seed)   # PCG64-backed Generator
gen2.random(1)

Is there a way to get the same PCG64 based pseudorandom sequence from both numpy and torch?

Thanks!

No, I don’t believe that’s possible, as the PRNG implementations differ internally: as far as I know, PyTorch’s CPU generator is based on the Mersenne Twister (and Philox on CUDA), not PCG64, so the two libraries won’t produce the same sequence from the same seed.
If needed, you could create the random values with numpy and convert them to tensors via torch.from_numpy.
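
A minimal sketch of that approach (the seed value, sample count, and dtype below are placeholders for illustration):

import numpy as np
import torch

seed = 1234                                   # placeholder; use the board's actual seed
rng = np.random.default_rng(seed=seed)        # PCG64-backed, same stream as the board

# Draw the values with numpy so the sequence matches the board exactly,
# then wrap them as a tensor for use inside the model's forward pass.
samples = rng.random(10, dtype=np.float32)    # shape/dtype chosen for illustration
noise = torch.from_numpy(samples)             # shares memory with the numpy array

Autograd will treat noise as a constant input rather than something produced by torch’s own RNG, which is usually what you want for injected randomness; gradients still flow through the model weights. Note that torch.from_numpy shares memory with the source array, so call .clone() on the tensor if you plan to reuse or overwrite that buffer.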