Dropout2d (channel dropout) but for 1D input?

I like the idea of trying channel dropout, what PyTorch calls Dropout2d, to drop entire filters from my convnet. But I am working with a 1D signal, not 2D images. nn.Dropout2d, which is just a wrapper that calls into F.dropout2d, expects input of shape (N, C, H, W) or (C, H, W).

This isn’t compatible with my input, which is (N, C, W) or (C, W). Am I missing anything in PyTorch that would enable channel dropout for 1D convnets?

I don’t know of an existing one, but you can unsqueeze the input and use the same module:

d = nn.Dropout2d()
out = d(x.unsqueeze(-1)).squeeze()  # x: (N, C, W); add a fake spatial dim, then remove it
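As a quick sanity check (just a sketch, the shapes here are made up), you can confirm that this really drops whole channels, i.e. a dropped channel is zero at every position along the length dimension:

import torch
import torch.nn as nn

d = nn.Dropout2d(p=0.5)                # the module starts in training mode, so dropout is active
x = torch.randn(4, 8, 100)             # (N, C, W) 1D feature map
out = d(x.unsqueeze(-1)).squeeze(-1)   # (N, C, W, 1) -> dropout -> (N, C, W)

# Each channel should be either fully zeroed or fully kept (scaled by 1/(1-p)),
# so the per-channel count of nonzero entries is either 0 or 100.
print((out != 0).sum(dim=-1))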

It looks like this works. Because of the unsqueeze/squeeze steps, it would normally take a few extra lines in each forward() function, from what I can tell. One workaround is below: I define a module that can then be used in the __init__() of other modules. There is probably a more elegant solution, and if so I’d like to learn it.

import torch.nn as nn

class ChannelDropout1d(nn.Module):
    def __init__(self, *args):
        super().__init__()
        self.d = nn.Dropout2d()
    
    def forward(self, x):
        out = x
        out = out.unsqueeze(-1)   # add a fake spatial dim: (N, C, W) -> (N, C, W, 1)
        out = self.d(out)         # Dropout2d zeroes entire channels
        out = out.squeeze()       # remove the extra dim again
        
        return out

A slight change: the most important part is specifying the squeeze dimension, otherwise this plays badly with batch sizes of 1 (squeeze() with no argument would also remove the batch dimension). I also added kwargs so that p (the dropout probability) can be passed through.

import torch.nn as nn

class ChannelDropout1d(nn.Module):
    def __init__(self, **kwargs):
        super().__init__()
        self.d = nn.Dropout2d(**kwargs)   # kwargs (e.g. p) are passed straight to Dropout2d
    
    def forward(self, x):
        out = x
        out = out.unsqueeze(-1)   # (N, C, W) -> (N, C, W, 1)
        out = self.d(out)
        out = out.squeeze(-1)     # only remove the fake spatial dim, keep the batch dim even if N == 1
        
        return out
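For example (a small sketch with made-up shapes), with a batch size of 1 the squeeze(-1) version keeps the batch dimension, whereas a bare squeeze() would collapse it as well:

import torch

cd = ChannelDropout1d(p=0.5)
x = torch.randn(1, 8, 100)   # batch size of 1
print(cd(x).shape)           # torch.Size([1, 8, 100]); a bare squeeze() would give (8, 100)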

Please see PyTorch’s source code; a 1D version of this dropout, nn.Dropout1d, is provided. Link: torch.nn.modules.dropout — PyTorch 1.13 documentation
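A minimal sketch of using nn.Dropout1d directly (it accepts (N, C, L) or (C, L) input, so no unsqueeze/squeeze wrapper is needed; the shapes below are just an illustration):

import torch
import torch.nn as nn

d = nn.Dropout1d(p=0.5)
x = torch.randn(4, 8, 100)   # (N, C, L)
out = d(x)                   # whole channels are zeroed during training
print(out.shape)             # torch.Size([4, 8, 100])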

Nice, thanks! Based on prior versions of the docs, it looks like it was introduced in 1.12.