Why do different devices generate different random numbers, even with the same seed?

import torch
cuda = torch.device("cuda")
torch.manual_seed(10)
torch.randn(3)
# always yields tensor([-0.6014, -1.0122, -0.3023])
torch.manual_seed(10)
torch.randn(3, device=cuda)
# always yields tensor([-0.1029,  1.6810, -0.2708], device='cuda:0')

As you can see, the random numbers differ depending on the device, even though I use the same seed for both calls. Isn't this a problem for reproducibility? Depending on which device a machine runs on, the results will differ. Shouldn't the sequence be the same across devices?

No. The pseudorandom number generator implementations can differ between devices (e.g., CPU vs. CUDA) and between software stacks, so the same seed is not expected to produce the same sequence across devices. Reproducibility (assuming you are following these docs) means that rerunning the same script on the same system, with the same device and software stack, yields the same results.
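The point can be illustrated without any GPU at all: two different PRNG implementations seeded identically produce unrelated streams, while each implementation on its own is fully reproducible. Below is a minimal sketch in plain Python; the lcg helper is a toy linear congruential generator written purely for illustration, not the algorithm PyTorch actually uses on either device.

```python
import random

def lcg(seed, n, a=1664525, c=1013904223, m=2**32):
    """Toy linear congruential generator: a deliberately different
    PRNG implementation from CPython's Mersenne Twister."""
    out, x = [], seed
    for _ in range(n):
        x = (a * x + c) % m
        out.append(x / m)  # scale to [0, 1)
    return out

seed = 10

# Two generators of the SAME implementation, same seed: identical streams.
mt_a = random.Random(seed)  # CPython's Mersenne Twister
mt_b = random.Random(seed)
print([mt_a.random() for _ in range(3)] == [mt_b.random() for _ in range(3)])  # True

# A DIFFERENT implementation with the same seed: an unrelated stream,
# yet still reproducible from run to run.
print(lcg(seed, 3) == lcg(seed, 3))  # True
print(lcg(seed, 3) == [random.Random(seed).random() for _ in range(3)])  # False
```

The same relationship holds between the CPU and CUDA generators in the question: each is deterministic given a seed, but they are different implementations, so their streams do not match.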