Does the random seed in PyTorch affect the behavior of dropout?

I know the random seed affects the initialization of the net.
However, I was wondering whether the random seed also affects the randomness of the dropout layers?

Yes, dropout layers will also be affected by the random seed, since they sample from the PRNG. Re-seeding replays the same random sequence, so the same masks are drawn (the seed value here is arbitrary):

import torch
import torch.nn.functional as F

x = torch.randn(1, 10)

torch.manual_seed(0)  # arbitrary seed
d11 = F.dropout(x, p=0.5, training=True)
d12 = F.dropout(x, p=0.5, training=True)

torch.manual_seed(0)  # re-seed: the PRNG replays the same sequence
d21 = F.dropout(x, p=0.5, training=True)
d22 = F.dropout(x, p=0.5, training=True)

print(torch.equal(d11, d21))  # True
print(torch.equal(d12, d22))  # True


Yes, I have verified it. Thank you very much!

Does this mean that the exact same nodes will be dropped out in every epoch during model training?

Yes, if you re-seed before every epoch, the same random numbers will be drawn again and your training would lose its "randomness". That is why you would usually set the seed only once, at the beginning of training, if you want reproducible results (besides the other setup mentioned in the Reproducibility docs).
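The point above can be sketched as follows: seeding once at the start keeps the dropout masks varying from epoch to epoch, while making the whole run reproducible when restarted with the same seed. This is a minimal sketch; the helper name `dropout_masks` and the seed value are arbitrary:

```python
import torch
import torch.nn.functional as F

def dropout_masks(seed, epochs=3):
    # Seed once at the start of the "run", then apply dropout
    # once per "epoch" on a fixed input.
    torch.manual_seed(seed)
    x = torch.ones(1, 20)
    return [F.dropout(x, p=0.5, training=True) for _ in range(epochs)]

run1 = dropout_masks(42)
run2 = dropout_masks(42)

# Within a run, the masks keep changing from epoch to epoch...
print(any(not torch.equal(a, b) for a, b in zip(run1, run1[1:])))
# ...but the whole sequence repeats exactly when the run is
# restarted with the same seed.
print(all(torch.equal(a, b) for a, b in zip(run1, run2)))  # True
```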