When I call dropout, it is not zeroing out any of my data points. I have tried both the layer (nn.Dropout) and functional (torch.nn.functional.dropout) forms. I am using PyTorch 1.3.0. Here is sample code:
import platform; print(f"Platform {platform.platform()}")
import sys; print(f"Python {sys.version}")
import torch; print(f"PyTorch {torch.__version__}")
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.do = nn.Dropout(p=0.5)  # each element is zeroed with probability 0.5

    def forward(self, x):
        return self.do(x)

net = Net()
data = torch.Tensor([1., 2., 3., 4., 5., 6.]).view(3, 1, 2)
print(data)
net.train()  # dropout is only active in training mode
print(net(data))
Dropout is working as expected; with only 6 elements, it can simply leave all of them untouched. Each element survives independently with probability 1 - p = 0.5, so the chance that a single forward pass zeroes nothing is (1/2)^6 = 1/64. You most likely just got lucky, and when you re-run it, you should get zeros, too.
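A minimal sketch to check this empirically (the trial count of 10,000 and the seed are arbitrary choices, not from the original post): apply the same dropout layer many times and count how often no element gets zeroed. The observed fraction should come out close to 1/64 ≈ 0.0156.

import torch
import torch.nn as nn

torch.manual_seed(0)  # arbitrary seed, just for reproducibility

do = nn.Dropout(p=0.5)  # modules start in training mode, so dropout is active
data = torch.Tensor([1., 2., 3., 4., 5., 6.]).view(3, 1, 2)

trials = 10_000
# All inputs are nonzero, so a zero in the output can only come from dropout.
no_zero_runs = sum(int((do(data) != 0).all()) for _ in range(trials))
print(no_zero_runs / trials)  # expected ≈ (1/2)**6 = 1/64 ≈ 0.0156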