Does the random seed in PyTorch affect the behavior of dropout?

I know the random seed affects the initialization of the net.
However, I wonder whether it also affects the randomness of the dropout layers?

Yes, dropout layers will also be affected by the random seed, as they are sampling from the PRNG:

import torch
import torch.nn.functional as F

x = torch.randn(1, 10)

# Seed the PRNG, then apply dropout twice
torch.manual_seed(2809)
d11 = F.dropout(x, p=0.5, training=True)
d12 = F.dropout(x, p=0.5, training=True)

# Re-seed with the same value and repeat
torch.manual_seed(2809)
d21 = F.dropout(x, p=0.5, training=True)
d22 = F.dropout(x, p=0.5, training=True)

# After re-seeding, the same masks are sampled
print((d11 == d21).all())
print((d12 == d22).all())
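Both prints return tensor(True), since re-seeding replays the same random number stream. Note that d11 and d12 will generally differ from each other, as each dropout call consumes fresh numbers from the PRNG.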

Yeah, I have verified it. Thank you very much!

Does this mean that exactly the same nodes will be dropped in every epoch during model training?

Yes, if you re-seed the code (e.g., at the start of every epoch), the same random numbers will be drawn and your training would lose its "randomness". This is why you would usually set the seed only once, at the beginning of training, if you want reproducible results (besides the other setup mentioned in the Reproducibility docs).
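To make this concrete, here is a minimal sketch (the seed value and tensor shape are just illustrative) contrasting re-seeding in every epoch with seeding once at the start:

import torch
import torch.nn.functional as F

x = torch.randn(1, 10)

# Re-seeding inside the loop: the identical dropout mask is drawn each "epoch"
masks = []
for epoch in range(3):
    torch.manual_seed(2809)
    masks.append(F.dropout(x, p=0.5, training=True))
print((masks[0] == masks[1]).all())  # tensor(True): the pattern never changes

# Seeding once at the start: every epoch samples a fresh mask
torch.manual_seed(2809)
masks = [F.dropout(x, p=0.5, training=True) for _ in range(3)]
print((masks[0] == masks[1]).all())  # almost certainly tensor(False)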
