Change the dropout rate dynamically

I found there are two ways to apply dropout: one via the `nn.Dropout` module and the other via the functional API (`torch.nn.functional.dropout`).
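For reference, here is a minimal sketch of both forms side by side; the module name `MyModel` and the layer sizes are just made up for illustration:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MyModel(nn.Module):  # hypothetical example module
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(16, 16)
        # module form: p is fixed when the module is constructed
        self.drop = nn.Dropout(p=0.5)

    def forward(self, x):
        x = self.drop(self.fc(x))                      # module API
        x = F.dropout(x, p=0.5, training=self.training)  # functional API
        return x
```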

Besides, the functional API also accepts the dropout rate as a parameter. Does that mean we can use the functional version inside the module's forward method to change the dropout rate dynamically?

Yes, you can set the dropout rate in each forward call.
Remember to pass the training argument (usually training=self.training); F.dropout defaults to training=True, so it would stay active even in eval mode otherwise.
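A minimal sketch of what that could look like; the module name `DynamicDropoutNet`, the `p` forward argument, and the layer sizes are assumptions for illustration:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicDropoutNet(nn.Module):  # hypothetical example module
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(16, 16)

    def forward(self, x, p=0.5):
        # p can differ on every call; training=self.training disables
        # dropout automatically when the model is in eval mode
        return F.dropout(self.fc(x), p=p, training=self.training)

model = DynamicDropoutNet()
x = torch.randn(4, 16)
out1 = model(x, p=0.2)   # 20% dropout on this call
out2 = model(x, p=0.8)   # 80% dropout on the next call
model.eval()
out3 = model(x, p=0.8)   # no dropout applied: self.training is now False
```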


I got it! Thank you 🙂