Using dropout as a function

Is there any way to use dropout as a function in the forward method? I want to be able to tweak the amount of dropout during training, so something like this would be great:

    layer = self.fc1(prediction)
    prediction = F.relu(layer)
    prediction = dropout(prediction, prob=parameter)

where parameter can be learned during training. This would be easy if there were an already written dropout function. With nn.Dropout, you have to instantiate the module with a given probability, and that probability then stays fixed for the whole of training.
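For reference, what I have now looks roughly like this (just a sketch, the layer sizes are made up), with the probability chosen once when the module is built:

    import torch.nn as nn
    import torch.nn.functional as F

    class Net(nn.Module):
        def __init__(self):
            super(Net, self).__init__()
            self.fc1 = nn.Linear(100, 50)    # placeholder sizes
            self.drop = nn.Dropout(p=0.5)    # probability picked here, at construction

        def forward(self, x):
            x = F.relu(self.fc1(x))
            return self.drop(x)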

Hi,

The dropout function already exists in the functional interface of nn (torch.nn.functional), at least on master (not sure about older releases), as dropout(input, p=0.5, training=False, inplace=False).
That being said, the dropout function is not differentiable with respect to the p parameter, so you won't get a gradient for it, only for input.
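For example, you can call it directly inside forward and read p from an attribute that you change during training. A minimal sketch (the module and attribute names are made up; passing training=self.training keeps dropout active only in train mode):

    import torch.nn as nn
    import torch.nn.functional as F

    class Net(nn.Module):
        def __init__(self, in_features, out_features, drop_p=0.5):
            super(Net, self).__init__()
            self.fc1 = nn.Linear(in_features, out_features)
            self.drop_p = drop_p  # plain Python number, can be changed between calls

        def forward(self, x):
            x = F.relu(self.fc1(x))
            # dropout is only applied when the module is in train() mode
            return F.dropout(x, p=self.drop_p, training=self.training)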

Thanks, so if I obtain the p parameter through information from other layers, this should work?

Also, do I need to specify training=True inside the forward method, or does that automatically get passed along during training?

input = Variable(...)

p = some_net(input)           # p is computed from the input by another network
x = some_other_net(input)

# p.data[0] extracts a plain Python number, so p is detached from the graph here
out = F.dropout(x, p.data[0], training=True)

out.backward()

In the above example, no gradient will flow back into some_net, so you won't be able to train it.
Even if the interface took a Variable as input for p, it would not work, because the expression d(out)/d(p) does not exist, so no gradient can be computed for that input.
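To make that concrete, here is a fuller version of the sketch above (the shapes and the two nets are made up, written against the same Variable-based API): p has to become a plain number before it reaches F.dropout, so the graph that produced it is cut off at that point.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F
    from torch.autograd import Variable

    some_net = nn.Sequential(nn.Linear(10, 1), nn.Sigmoid())  # outputs a value in (0, 1)
    some_other_net = nn.Linear(10, 5)

    input = Variable(torch.randn(4, 10))

    p = some_net(input).mean()    # a Variable, but it cannot stay one
    x = some_other_net(input)

    out = F.dropout(x, p=p.data[0], training=True)  # p.data[0] is a plain float

    out.sum().backward()
    # some_other_net now has gradients; some_net has none, because p left the graph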