Is dropout in pytorch a pseudo-random operation?

For example, out is a tensor of shape [n, c, h, w]:

out = self.lrelu(self.conv_1(out))
out1 = F.dropout(out, 0.1, training=True)
out2 = F.dropout(out, 0.1, training=True)
out3 = F.dropout(out, 0.1, training=True)

print(out1)
print(out2)
print(out3)

When I run the code above several times, out1, out2, and out3 differ from one another within a single run, as I expected.
But out1 from the first run is the same as out1 from the second run, and out2 and out3 show the same behavior.
Is this correct?
Thank you

F.dropout randomly samples a new mask in each call using the internal pseudo-random number generator. If you seed the code, you will therefore get the same sequence of random numbers, and the results will be equal across different runs of the script.
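
A minimal sketch of both effects on a toy tensor (the names x, a, b, c, d and the seed 0 are just placeholders for illustration):

import torch
import torch.nn.functional as F

x = torch.ones(8, 8)

# Within one run, each call advances the global PRNG state,
# so successive masks almost certainly differ.
a = F.dropout(x, 0.1, training=True)
b = F.dropout(x, 0.1, training=True)
print(torch.equal(a, b))   # almost certainly False

# Re-seeding rewinds the generator, so the same mask is reproduced.
torch.manual_seed(0)
c = F.dropout(x, 0.1, training=True)
torch.manual_seed(0)
d = F.dropout(x, 0.1, training=True)
print(torch.equal(c, d))   # True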

Thank you very much~ 😀

Hi, I wonder how the pseudo-random number generator works in PyTorch. As in torch.rand — PyTorch 1.9.0 documentation, is there any explicit C++ code from which we can determine its mechanism?
Thank you very much.

On the GPU, cuRAND should be used with a Philox PRNG. I'm not sure how random numbers are created on the CPU, but it seems that on Linux systems /dev/urandom might be used.
CPU reference, CUDA reference
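
Independent of the exact engine, you can inspect and control the generator state from Python. A small sketch (the seed 42 is arbitrary) showing that the PRNG state fully determines the output:

import torch

# CPU: an explicit generator object exposes its state directly.
g = torch.Generator(device="cpu")
g.manual_seed(42)
state = g.get_state()          # snapshot the PRNG state
a = torch.rand(3, generator=g)
g.set_state(state)             # rewind to the snapshot
b = torch.rand(3, generator=g)
print(torch.equal(a, b))       # True: identical state gives identical numbers

# CUDA: seeding routes through the Philox-based generator.
if torch.cuda.is_available():
    torch.cuda.manual_seed_all(42)
    x = torch.rand(3, device="cuda")
    torch.cuda.manual_seed_all(42)
    y = torch.rand(3, device="cuda")
    print(torch.equal(x, y))   # True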