Where and How to add Dropout in ResNet18

I want to add dropout to ResNet18 but don’t know where to put it.
Following WideResNet, I placed the dropout in the BasicBlock class; part of my code is:

class BasicBlock(nn.Module):
    expansion = 1

    def __init__(self, inplanes, planes, stride=1, downsample=None,dropRate=0.5):
        super(BasicBlock, self).__init__()
        self.conv1 = conv3x3(inplanes, planes, stride)
        self.bn1 = nn.BatchNorm2d(planes)
        self.relu = nn.ReLU(inplace=True)
        self.conv2 = conv3x3(planes, planes)
        self.bn2 = nn.BatchNorm2d(planes)
        self.downsample = downsample
        self.stride = stride
        self.droprate = dropRate

    def forward(self, x):
        residual = x

        out = self.conv1(x)
        out = self.bn1(out)
        out = self.relu(out)

        out = self.conv2(out)
        out = self.bn2(out)

        if self.downsample is not None:
            residual = self.downsample(x)

        out += residual
        out = self.relu(out)
        if self.droprate > 0:
            out = F.dropout(out, p=self.droprate, training=self.training)

        return out

But the value of “p” has no impact on performance, so I suspect I am using dropout incorrectly. My question is: where and how should dropout be added in ResNet?
Really hope to get some help, thanks!
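One quick sanity check, independent of the ResNet code above: `F.dropout` only zeroes activations when `training=True`; with `training=False` it is the identity and `p` is ignored. So if the block is only ever run with the model in eval mode, changing `p` will appear to do nothing. A minimal sketch:

```python
import torch
import torch.nn.functional as F

x = torch.ones(2, 8, 4, 4)

# Training mode: surviving elements are scaled by 1/(1-p), the rest are zeroed,
# so with p=0.5 every value is either 0.0 or 2.0
out_train = F.dropout(x, p=0.5, training=True)
print(sorted(out_train.unique().tolist()))  # subset of [0.0, 2.0]

# Eval mode (training=False): dropout is the identity, p is ignored
out_eval = F.dropout(x, p=0.5, training=False)
print(torch.equal(out_eval, x))  # True
```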

Hi, I’m also interested in this problem. Is there any update on this?

From what I saw, it seems most common to place dropout after each ReLU.
This code goes recursively through each block:

    import torch.nn as nn
    from torchvision.models import resnet18

    model = resnet18()

    def append_dropout(model, rate=0.2):
        for name, module in model.named_children():
            # Recurse into container modules (layers, blocks) first
            if len(list(module.children())) > 0:
                append_dropout(module, rate)
            # Replace every ReLU with (ReLU -> Dropout2d)
            # inplace=False: an in-place dropout can clash with the in-place
            # ReLU and the residual connection during backprop
            if isinstance(module, nn.ReLU):
                new = nn.Sequential(module, nn.Dropout2d(p=rate, inplace=False))
                setattr(model, name, new)

    append_dropout(model)
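To check that the replacement actually happened, here is a self-contained sketch of the same idea. It uses a small toy model in place of ResNet18 (so it runs without torchvision), counts the inserted `Dropout2d` modules, and confirms that `model.eval()` makes the output deterministic again:

```python
import torch
import torch.nn as nn

# Toy stand-in for ResNet18: a conv followed by a ReLU (hypothetical example model)
model = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU())

def append_dropout(model, rate=0.2):
    # Recursively walk submodules and wrap every ReLU in (ReLU -> Dropout2d)
    for name, module in model.named_children():
        if len(list(module.children())) > 0:
            append_dropout(module, rate)
        if isinstance(module, nn.ReLU):
            new = nn.Sequential(module, nn.Dropout2d(p=rate, inplace=False))
            setattr(model, name, new)

append_dropout(model, rate=0.5)

# The toy model has exactly one ReLU, so exactly one Dropout2d was inserted
n_dropout = sum(isinstance(m, nn.Dropout2d) for m in model.modules())
print(n_dropout)  # 1

# In eval mode the inserted dropout is the identity: output is deterministic
model.eval()
x = torch.ones(1, 3, 4, 4)
print(torch.equal(model(x), model(x)))  # True
```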