```python
@weak_script
def dropout(input, p=0.5, training=True, inplace=False):
    # type: (Tensor, float, bool, bool) -> Tensor
    r"""
    During training, randomly zeroes some of the elements of the input
    tensor with probability :attr:`p` using samples from a Bernoulli
    distribution.

    See :class:`~torch.nn.Dropout` for details.

    Args:
        p: probability of an element to be zeroed. Default: 0.5
        training: apply dropout if is ``True``. Default: ``True``
        inplace: If set to ``True``, will do this operation in-place. Default: ``False``
    """
    if p < 0. or p > 1.:
        raise ValueError("dropout probability has to be between 0 and 1, "
                         "but got {}".format(p))
    return (_VF.dropout_(input, p, training)
            if inplace
            else _VF.dropout(input, p, training))
```

Where is the code for this function? In other words, where is the `_VF` object defined?

I ask because I wanted to confirm whether this function is aware of the training vs. not-training stage.

You can find the implementation here (`_VF` is a thin wrapper that dispatches to the native ATen operators exposed through `torch._C._VariableFunctions`).

A quick code sample also shows that the behavior is changed in training/eval:

```python
import torch
import torch.nn.functional as F

x = torch.randn(1, 10)
print(F.dropout(x, p=0.5, training=True))
# tensor([[-0.0000,  1.2927, -0.0000,  0.0000,  1.2973, -0.0000, -1.5645, -0.0000,
#          -0.0000, -0.0000]])
print(F.dropout(x, p=0.5, training=False))
# tensor([[-0.4785,  0.6463, -0.4714,  1.4728,  0.6487, -2.0346, -0.7822, -1.0754,
#          -0.7367, -0.8063]])
```
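Note the scaling in the outputs above: the surviving elements in training mode are the eval-mode values multiplied by `1 / (1 - p)` (e.g. `1.2927 ≈ 2 × 0.6463`), so the expected value of each element stays the same. A pure-Python sketch of this inverted-dropout rule (just the math the kernel implements, not PyTorch's actual code):

```python
import random

def dropout_sketch(values, p=0.5, training=True, rng=random.random):
    """Inverted dropout: zero each element with probability p during
    training and scale survivors by 1/(1-p); identity in eval mode."""
    if not training or p == 0.0:
        return list(values)
    if p == 1.0:
        return [0.0] * len(values)  # everything is dropped
    scale = 1.0 / (1.0 - p)
    return [0.0 if rng() < p else v * scale for v in values]

x = [0.5, -1.0, 2.0, 0.25]
print(dropout_sketch(x, p=0.5, training=False))  # unchanged in eval mode
print(dropout_sketch(x, p=0.5, training=True))   # some zeros, rest doubled
```

`dropout_sketch` is a hypothetical helper for illustration; the scaling during training is what lets the network run unchanged at eval time.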

I am still in shock:

```python
import torch
import torch.nn.functional as F

x = torch.randn(1, 10)
print(F.dropout(x, p=0.5, training=True))
# tensor([[ 0.0000, -2.0286, -2.6529,  0.0000,  0.6054,  0.0000, -0.0000, -0.5667,
#           0.3148,  1.6862]])

print(F.dropout(x, p=0.5, training=False))
# tensor([[ 0.2737, -1.0143, -1.3264,  0.0609,  0.3027,  0.0464, -1.5175, -0.2833,
#           0.1574,  0.8431]])
```

Apparently on my end I don't always get 5/10 zeros; sometimes I get just one. Is this strange, or OK? Here I showed an output with 4 zeros.

It's a random operation (that's why you specify a probability), so you can't expect to get exactly 5/10 zeros.
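To make that concrete: each of the 10 elements is zeroed independently with probability 0.5, so the number of zeros follows a Binomial(10, 0.5) distribution, and exactly 5 zeros happens only about a quarter of the time. A quick stdlib check (a sketch of the math, not PyTorch code):

```python
from math import comb

n, p = 10, 0.5
# P(k zeros) = C(n, k) * p^k * (1-p)^(n-k)
probs = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

print(round(probs[5], 3))   # exactly 5 zeros: 0.246
print(round(probs[4], 3))   # 4 zeros (as in the output above): 0.205
print(round(probs[1], 3))   # just one zero: rare but possible, 0.01
```

So seeing 4 zeros is almost as likely as seeing 5, and even a single zero turns up about once in a hundred calls.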

Nope (no problem), thanks for the explanation of dropout.