# Dropout weird behavior

Why does dropout sometimes drop more and sometimes fewer than 3 elements of a 2x5 matrix when I set the probability to 0.3?
```python
import torch
import torch.nn as nn

y = torch.ones(2, 5)
x = nn.Dropout(0.3)
x(y)
```
I got this output:

```
tensor([[1.4286, 1.4286, 1.4286, 0.0000, 0.0000],
        [1.4286, 1.4286, 1.4286, 1.4286, 1.4286]])
```

and on a later execution, this output:

```
tensor([[0.0000, 0.0000, 1.4286, 1.4286, 0.0000],
        [1.4286, 0.0000, 0.0000, 1.4286, 1.4286]])
```

Hi,

Each element is dropped independently with probability 0.3. You have 10 values, so you may expect exactly 3 of them to be dropped every time, but it does not need to be exactly 3: sometimes it is 2, sometimes 4 or even 5. (The surviving elements are scaled by 1/(1-p) = 1/0.7 ≈ 1.4286 during training, which is why you see that value instead of 1.) I can give one intuitive example.

Suppose you have an unbiased coin, with P(H) = P(T) = 0.5 (here H = head, T = tail). If you toss the coin 10 times, on average you get 5 heads and 5 tails, but a particular run can give 4 heads and 6 tails. The probability only enforces that the *expected* number of heads in 10 tosses is 5.
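You can see this directly by counting the zeros over many forward passes; a quick sketch (the variable names and the seed are mine):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)  # just so this sketch is reproducible

drop = nn.Dropout(p=0.3)
y = torch.ones(2, 5)

# Count how many of the 10 elements are zeroed on each forward pass
counts = [(drop(y) == 0).sum().item() for _ in range(1000)]
mean_dropped = sum(counts) / len(counts)
print(mean_dropped)  # close to 10 * 0.3 = 3, but individual runs vary
```

Individual entries in `counts` range over 0 to 10, while the average settles near 3.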

Thanks
Regards
Pranavan

So what do I need to do to drop exactly 3 of the 10 elements every time?
Not more, not fewer.

In neural networks, the randomness we have is based on probability. As far as I know, PyTorch does not provide any module that enforces an exact count. If you want that exact behaviour, you can select 3 indices at random and set them to zero. To do that, you have to write a custom layer or module in PyTorch that extends the class `nn.Module`. Then you can plug it in as a layer in your model.
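A minimal sketch of such a custom layer, assuming you want it to behave like `nn.Dropout` otherwise (the name `FixedDropout` and the `n / (n - k)` rescaling of the survivors are my choices, not a PyTorch API):

```python
import torch
import torch.nn as nn

class FixedDropout(nn.Module):
    """Zeroes exactly k randomly chosen elements during training.

    Survivors are scaled by n / (n - k) so the expected sum is
    preserved, mirroring nn.Dropout's 1 / (1 - p) scaling.
    """
    def __init__(self, k):
        super().__init__()
        self.k = k

    def forward(self, x):
        if not self.training or self.k == 0:
            return x  # identity in eval mode, like nn.Dropout
        n = x.numel()
        # pick k distinct flat indices at random
        idx = torch.randperm(n, device=x.device)[:self.k]
        mask = torch.ones(n, device=x.device)
        mask[idx] = 0.0
        return x * mask.view(x.shape) * (n / (n - self.k))

layer = FixedDropout(k=3)
out = layer(torch.ones(2, 5))
print((out == 0).sum().item())  # prints 3, every time
```

Note this is no longer standard dropout: the drop decisions are not independent per element, which changes the noise the layer injects, but it does guarantee exactly 3 zeros on every training-mode forward pass.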

Thanks