Confusion about dropout2d

I am a beginner studying the MNIST example.
I see that both the torch.nn module and torch.nn.functional have dropout and dropout2d. What's the difference between them?
Besides, I used F.dropout2d instead of the nn.Dropout2d class to train the network, and the training parameter was not set in F.dropout() for the fc layer, but my network still works. I am confused.
My code is as follows:

import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(1, 10, kernel_size=5)
        self.conv2 = nn.Conv2d(10, 20, kernel_size=5)
        self.fc1 = nn.Linear(320, 50)
        self.fc2 = nn.Linear(50, 10)

    def forward(self, x):
        x = F.relu(F.max_pool2d(self.conv1(x), 2))
        # functional dropout2d, no training flag passed
        x = F.relu(F.max_pool2d(F.dropout2d(self.conv2(x)), 2))
        x = x.view(-1, 320)
        x = F.relu(self.fc1(x))
        # functional dropout, no training flag passed
        x = F.dropout(x)
        x = F.log_softmax(self.fc2(x))
        return x

The original code:

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(1, 10, kernel_size=5)
        self.conv2 = nn.Conv2d(10, 20, kernel_size=5)
        self.conv2_drop = nn.Dropout2d()
        self.fc1 = nn.Linear(320, 50)
        self.fc2 = nn.Linear(50, 10)

    def forward(self, x):
        x = F.relu(F.max_pool2d(self.conv1(x), 2))
        x = F.relu(F.max_pool2d(self.conv2_drop(self.conv2(x)), 2))
        x = x.view(-1, 320)
        x = F.relu(self.fc1(x))
        x = F.dropout(x, training=self.training)
        x = self.fc2(x)
        return F.log_softmax(x, dim=1)

The main difference is that nn.Dropout works on input tensors of any shape and zeroes individual elements independently, while nn.Dropout2d is a spatial dropout designed for 4-D tensors such as the feature maps produced by convolution layers. In feature maps, adjacent activations are often strongly correlated, so standard element-wise dropout does not regularize the network effectively. Dropout2d (also called SpatialDropout) instead zeroes out entire channels, so within a feature map the activations are either all dropped or all kept. You can read more about it in this arXiv paper: https://arxiv.org/abs/1411.4280
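
To make the difference visible, here is a small sketch (not from the original post; the tensor sizes are just for illustration) comparing F.dropout and F.dropout2d on the same feature map:

import torch
import torch.nn.functional as F

torch.manual_seed(0)

# one sample, 4 channels, 3x3 feature maps, all ones
x = torch.ones(1, 4, 3, 3)

# element-wise dropout: zeroes individual values independently
out_elem = F.dropout(x, p=0.5, training=True)

# spatial dropout: zeroes entire channels at once
out_chan = F.dropout2d(x, p=0.5, training=True)

print(out_elem[0])  # zeros scattered inside each 3x3 map
print(out_chan[0])  # each 3x3 map is either all zeros or all 2.0 (kept values are scaled by 1/(1-p))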

Regarding your second issue:
If you use the functional API (F.dropout), you have to set the training flag yourself, as shown in the original example with training=self.training. Your network still trains because F.dropout defaults to training=True; the downside is that without the flag, dropout stays active during evaluation as well, which is usually not what you want.
It is often easier to create dropout as a module in __init__ and call it in forward, as done with self.conv2_drop. The module is then switched between training and evaluation mode automatically when you call model.train() or model.eval().
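
As a minimal sketch of that difference (the TinyNet class and layer sizes here are made up for illustration), a Dropout module follows model.train()/model.eval() automatically, while the functional call only does so if you pass self.training:

import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyNet(nn.Module):
    def __init__(self):
        super(TinyNet, self).__init__()
        self.fc = nn.Linear(8, 8)
        self.drop = nn.Dropout(p=0.5)  # module: toggled by model.train()/model.eval()

    def forward(self, x):
        x = self.drop(self.fc(x))                        # switched automatically
        x = F.dropout(x, p=0.5, training=self.training)  # switched only because we pass the flag
        return x

model = TinyNet()
x = torch.ones(1, 8)

model.train()
print(model(x))  # dropout active in both calls

model.eval()
print(model(x))  # dropout disabled in both calls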
