How to add dropout in PyTorch dynamically?

Normally we can create a dropout layer with:
self.drop_out = nn.Dropout(p=0.5)

In my case, the dropout probability depends on the input and will be passed in along with the input vector x during the forward pass. Can anyone tell me how to achieve this?

You could use the functional API in the forward pass via:

def forward(self, x, p):
    ...
    # F is torch.nn.functional; p can be any float computed at call time
    x = F.dropout(x, p, training=self.training)
    ...

Note that you should pass the self.training attribute to this call so that dropout is disabled when you call model.eval().
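
For completeness, here is a minimal self-contained sketch of this pattern (the module name DynamicDropoutNet and the layer sizes are just placeholders, not from the original post):

import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicDropoutNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(10, 10)

    def forward(self, x, p):
        x = self.fc(x)
        # p is a float chosen at call time; self.training is toggled
        # automatically by model.train() / model.eval().
        return F.dropout(x, p=p, training=self.training)

model = DynamicDropoutNet()
out = model(torch.randn(32, 10), p=0.3)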


Do I need to specify self.drop_out = nn.Dropout(p=0.5) in the __init__?
I don’t think it’s required, right?

Also, I have another problem.
I am passing a batch of size 32, so x has shape (32, y).
The probability p also has shape (32, 1), i.e. p is an array of 32 values, one per sample.
Now, for each sample in x, I want to apply the corresponding indexed probability. Can you please help me achieve that?

You don’t need to create the module if you are using the functional API.
p is a scalar value, so F.dropout will apply the same probability to the whole batch and will not accept one value per sample.
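
If you really need a different drop probability per sample, F.dropout won’t take a tensor for p, but you could build the mask yourself. A minimal sketch (the function name per_sample_dropout and the inverted-dropout scaling are my own illustration, not a built-in API; p is assumed to be in [0, 1)):

import torch

def per_sample_dropout(x, p, training=True):
    # x: (batch, features) activations
    # p: (batch, 1) drop probabilities, one per sample, each in [0, 1)
    if not training:
        return x
    keep_prob = 1.0 - p                        # (batch, 1)
    # Sample an element-wise keep/drop mask, broadcasting each sample's
    # keep probability across its feature dimension.
    mask = torch.bernoulli(keep_prob.expand_as(x))
    # Scale by 1/keep_prob (inverted dropout, as nn.Dropout does) so the
    # expected activation stays unchanged.
    return x * mask / keep_prob

x = torch.randn(32, 10)
p = torch.rand(32, 1) * 0.5   # per-sample drop probabilities in [0, 0.5)
out = per_sample_dropout(x, p)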