Implement dropout layer in nn.Sequential

I am trying to implement a Dropout layer using PyTorch as follows:

class DropoutLayer(nn.Module):
    def __init__(self, p):
        super().__init__()
        self.p = p

    def forward(self, input):
        if self.training:
            u1 = (np.random.rand(*input.shape)<self.p) / self.p
            u1 *= u1
            return u1
        else:
            input *= self.p

And then calling it in a simple nn.Sequential:

model = nn.Sequential(nn.Linear(input_size,num_classes), DropoutLayer(.7), nn.Flatten())

opt = torch.optim.Adam(model.parameters(), lr=0.005)
train(model, opt, 5)  # train(model, optimizer, epochs)

But I’m getting the following error:

TypeError: flatten() takes at most 1 argument (2 given)

Not sure what I’m doing wrong. Still new to PyTorch. Thanks.

During training, your custom dropout layer would only return the scaled drop mask without the input (note that u1 *= u1 multiplies the mask by itself rather than applying it to input), while during evaluation nothing would be returned at all, since the else branch has no return statement.
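For reference, here is a minimal sketch of how the forward pass could be corrected, assuming self.p is meant as the keep probability and that inverted dropout is intended (scale during training, return the input unchanged at eval time). Using torch.rand_like also keeps the result a tensor on the input's device, which is probably related to the flatten() error as well: nn.Flatten calls tensor.flatten(start_dim, end_dim), and a NumPy array's flatten() does not accept those two arguments.

import torch
import torch.nn as nn

class DropoutLayer(nn.Module):
    def __init__(self, p):
        super().__init__()
        self.p = p  # treated here as the keep probability

    def forward(self, input):
        if self.training:
            # Inverted dropout: build a scaled binary mask and apply it to the input
            mask = (torch.rand_like(input) < self.p).float() / self.p
            return input * mask
        else:
            # Activations were already scaled during training, so pass the input through
            return input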

Could you post the complete stack trace for the flatten error?

Hi, the error has been resolved. I simply needed to call nn.Flatten() first on the multidimensional input to convert it into a 2D tensor, after which dropout is called. Thanks.
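For anyone landing here later, the reordered model described above presumably looks something like this (a sketch that keeps the input_size, num_classes, and train helper defined elsewhere in the question's code):

model = nn.Sequential(nn.Flatten(), nn.Linear(input_size, num_classes), DropoutLayer(.7))

opt = torch.optim.Adam(model.parameters(), lr=0.005)
train(model, opt, 5)  # train(model, optimizer, epochs)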