Dropout during inference

I would like to enable dropout during inference, so I am creating the dropout layer as follows:

self.monte_carlo_layer = None
if monte_carlo_dropout:
    dropout_class = getattr(nn, 'Dropout{}d'.format(dimensions))
    self.monte_carlo_layer = dropout_class(p=monte_carlo_dropout)

And I invoke it in the forward function as:

def forward(self, x):
    if self.monte_carlo_layer is not None:
        x = self.monte_carlo_layer(x)

My question is: will this ensure that dropout is still applied during testing, i.e. after eval() has been called?
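To illustrate the concern, here is a minimal check (a sketch using a standalone nn.Dropout2d rather than the posted module): a dropout layer in eval mode passes its input through unchanged, so creating the layer alone does not keep it active at test time.

```python
import torch
import torch.nn as nn

drop = nn.Dropout2d(p=0.5)
drop.eval()  # eval mode: dropout acts as an identity, nothing is zeroed

x = torch.ones(1, 8, 4, 4)
out = drop(x)
print(torch.equal(out, x))  # the output is identical to the input
```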

No: model.eval() puts every submodule, including dropout layers, into evaluation mode, where dropout is a no-op. I'm not sure how the posted code is used, but I would recommend explicitly calling train() on the dropout module to keep it active:

model.eval() # sets all layers to eval
model.drop_layer.train() # resets dropout to train
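Putting it together, here is a small Monte Carlo dropout sketch (the Net module, its layer sizes, and the attribute name drop_layer are illustrative assumptions, not taken from the posted code): set the whole model to eval(), switch only the dropout submodule back to train(), and run several stochastic forward passes to estimate a predictive mean and uncertainty.

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self, p=0.5):
        super().__init__()
        self.fc = nn.Linear(10, 10)
        self.drop_layer = nn.Dropout(p=p)

    def forward(self, x):
        return self.drop_layer(self.fc(x))

model = Net()
model.eval()              # sets all layers to eval
model.drop_layer.train()  # re-enables dropout for MC sampling

x = torch.randn(1, 10)
with torch.no_grad():
    # each pass draws a fresh dropout mask, so outputs differ
    samples = torch.stack([model(x) for _ in range(20)])

mean = samples.mean(dim=0)  # MC estimate of the prediction
std = samples.std(dim=0)    # spread across masks, a rough uncertainty proxy
```

Note that only drop_layer is in training mode here; batch-norm and other layers stay in eval mode, which is usually what you want for MC dropout.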