Yes, you are right. Have a look at the example code:
torch.manual_seed(2809)
a = torch.empty(10).random_(100)
b = torch.empty(10).random_(100)
print(a)
print(b)
> tensor([ 37., 81., 75., 39., 77., 30., 23., 28., 96., 66.])
> tensor([ 27., 49., 75., 11., 71., 48., 6., 74., 41., 20.])
torch.manual_seed(2809)
a = torch.empty(10).random_(100)
print(a)
> tensor([ 37., 81., 75., 39., 77., 30., 23., 28., 96., 66.])
As you can see, the same 10 “random” numbers were sampled after I reset the seed.
The same goes for randomly sampled values on the GPU:
torch.cuda.manual_seed(2809)
a_cuda = torch.empty(10, device='cuda').random_(100)
b_cuda = torch.empty(10, device='cuda').random_(100)
print(a_cuda)
print(b_cuda)
> tensor([ 85., 23., 41., 10., 21., 6., 84., 88., 13., 17.], device='cuda:0')
> tensor([ 53., 20., 8., 56., 17., 34., 55., 14., 68., 41.], device='cuda:0')
torch.cuda.manual_seed(2809)
a_cuda = torch.empty(10, device='cuda').random_(100)
print(a_cuda)
> tensor([ 85., 23., 41., 10., 21., 6., 84., 88., 13., 17.], device='cuda:0')
If you don’t sample any random numbers on the GPU, torch.cuda.manual_seed won’t have any effect. However, I would still recommend setting all the seeds, in case you later sample on the GPU.
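To set all the seeds in one place, you could use a small helper like the sketch below (`seed_everything` is a hypothetical name here, not a PyTorch API; it seeds Python’s `random`, NumPy, and PyTorch’s CPU and GPU generators):

```python
import random

import numpy as np
import torch


def seed_everything(seed: int) -> None:
    """Hypothetical helper: seed the common RNGs in one call."""
    random.seed(seed)                 # Python's built-in RNG
    np.random.seed(seed)              # NumPy's global RNG
    torch.manual_seed(seed)           # PyTorch CPU RNG
    torch.cuda.manual_seed_all(seed)  # all GPU RNGs; safe to call without a GPU


seed_everything(2809)
a = torch.empty(10).random_(100)
seed_everything(2809)
b = torch.empty(10).random_(100)
print(torch.equal(a, b))  # True
```

Note that `torch.cuda.manual_seed_all` is a no-op until CUDA is actually used, so calling it on a CPU-only machine is harmless.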