How to use dropout correctly in a neural network (PyTorch)

I have designed my network as follows, and I am not sure whether it is right to apply Dropout just after ReLU. I am doing a multi-class image classification task. The images are grayscale and 64×64 in size.

class Neural_Network(nn.Module):
    def __init__(self, input_node, hidden_node, hidden_node2, num_classes):
        super(Neural_Network, self).__init__()
        self.input_node = input_node
        self.layer1 = nn.Linear(input_node, hidden_node)
        self.relu = nn.ReLU()  # stateless, can be reused for both layers
        self.layer2 = nn.Linear(hidden_node, hidden_node2)
        self.layer2_drop = nn.Dropout(p=0.5)
        self.layer3 = nn.Linear(hidden_node2, num_classes)

    def forward(self, x):
        x = self.layer1(x)
        x = self.relu(x)
        out = self.layer2(x)
        out = self.relu(out)
        out = self.layer2_drop(out)
        out = self.layer3(out)
        return out

Google for “dropout before or after activation” and you will find a whole bunch of discussion about which order might or might not be preferable. I don’t think there’s a hard consensus. Dropout before the activation function seems to be more common, but I didn’t check in detail.
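For ReLU specifically, the two orderings should give the same result up to the random mask: dropout only zeroes elements and rescales the rest by 1/(1-p), ReLU maps zero to zero, and scaling by a positive constant commutes with ReLU. A minimal sketch to check this, fixing the seed so both calls draw the same dropout mask:

```python
import torch
import torch.nn as nn

relu = nn.ReLU()
drop = nn.Dropout(p=0.5)  # modules are in training mode by default, so dropout is active

x = torch.randn(4, 8)

# Dropout after the activation (as in the question)
torch.manual_seed(0)
a = drop(relu(x))

# Dropout before the activation
torch.manual_seed(0)  # reset RNG so the same mask is sampled
b = relu(drop(x))

# Both orderings produce the same tensor for ReLU
print(torch.allclose(a, b))  # True
```

For other activations (e.g. sigmoid or tanh, which map zero to a nonzero slope region or saturate), the two orderings are genuinely different, which is where the ordering debate matters more.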
