Sample outputs using Dropout

Suppose I have a dropout layer after my fully connected layer:

    def forward(self, x):
        bs = x.size(0)
        x = func.relu(self.conv1(x))
        x = func.relu(self.conv2(x))
        x = func.relu(self.conv3(x))
        x = x.view(bs, -1)
        x = func.relu(self.fc1(x))
        x = func.dropout(x, self.training)
        x = self.fc2(x)
        return x

Theoretically, if I pass the same x to this function multiple times, I should get different results each time, since dropout randomly drops 50% of the units. However, when I execute the following code, I always get the same outputs:

    samples = []
    for _ in range(sample_amount):
        x = Variable(torch.from_numpy(states), volatile=True)
        samples.append(self.forward(x).data.cpu().numpy()[0])

Also, is dropout applied to the weights or to the input tensor? It seems to me that dropout currently only applies to the input tensor. Is there a way to apply dropout to the weights of the fully connected layer?
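On the second question: one common way to drop weights rather than activations is DropConnect-style weight dropout, where `func.dropout` is applied to the weight matrix before the linear operation. A minimal sketch, assuming a custom module (the `WeightDropLinear` name and design are hypothetical, not part of the original code):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class WeightDropLinear(nn.Module):
    """Linear layer that applies dropout to its weight matrix (DropConnect-style)."""

    def __init__(self, in_features, out_features, p=0.5):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        self.p = p

    def forward(self, x):
        # Zero out random entries of the weight matrix; surviving entries
        # are rescaled by 1/(1-p). Only active when self.training is True.
        w = F.dropout(self.linear.weight, p=self.p, training=self.training)
        return F.linear(x, w, self.linear.bias)

layer = WeightDropLinear(16, 4)
out = layer(torch.randn(2, 16))  # shape: (2, 4)
```

This drops individual weight entries each forward pass; substituting it for a plain `nn.Linear` leaves the rest of the model unchanged.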

By default the `training` argument of `func.dropout` is `False`, so dropout is never applied here. You need to pass `training=True` explicitly. Note also that the second positional argument of `func.dropout` is the dropout probability `p`, so `func.dropout(x, self.training)` passes `self.training` as `p`, not as the training flag.
http://pytorch.org/docs/master/nn.html#torch.nn.functional.dropout
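The effect of the `training` flag can be checked in isolation; this sketch passes it explicitly both ways (so it does not depend on the default value, which has changed across PyTorch versions):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
x = torch.ones(1, 100)

# training=False: dropout is a no-op, the input passes through unchanged.
y = F.dropout(x, p=0.5, training=False)

# training=True: each call draws a fresh random mask, zeroing ~50% of the
# units and rescaling the survivors by 1/(1-p) = 2, so repeated calls
# on the same input give different outputs.
z1 = F.dropout(x, p=0.5, training=True)
z2 = F.dropout(x, p=0.5, training=True)

print(torch.equal(y, x))    # no-op in eval mode
print(torch.equal(z1, z2))  # different masks across calls
```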
