Will eval mode disable this dropout?

In my descendent of torch.nn.Module this is the start of my forward function:

def forward(self, x):
    dropout_out = self.dropout(x.float())
    layer1_out = self.linear1(dropout_out)

Will self.dropout automatically do nothing if self.eval() has been called?

Assuming that self.dropout is a torch.nn.Dropout module, then yes: in evaluation mode dropout_out will be equal to x.float(). During evaluation, dropout acts as the identity function, so calling self.eval() (which sets training=False on all submodules) effectively disables it.
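You can verify this directly. A minimal standalone check (dropout probability and tensor shape chosen arbitrarily for illustration):

```python
import torch
import torch.nn as nn

dropout = nn.Dropout(p=0.5)
x = torch.randn(4, 3)

dropout.eval()  # evaluation mode: dropout becomes the identity
out = dropout(x)
print(torch.equal(out, x))  # True: output is unchanged in eval mode
```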

If you want to apply dropout during evaluation (e.g. to implement Monte Carlo Dropout or similar), then you can use the torch.nn.functional.dropout function instead of the module. The functional form has a boolean training parameter that allows you to explicitly control whether to operate in training or evaluation mode.
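For example, a sketch of forcing dropout at evaluation time with the functional API (the input here is just a dummy tensor of ones, so surviving elements are scaled to 1 / (1 - p) = 2):

```python
import torch
import torch.nn.functional as F

x = torch.ones(1000)

# training=True applies dropout regardless of the module's mode,
# which is the basis of Monte Carlo Dropout at inference time.
out = F.dropout(x, p=0.5, training=True)

# Each element is either zeroed or rescaled by 1 / (1 - p)
print(sorted(out.unique().tolist()))  # [0.0, 2.0]
```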


It is torch.nn.Dropout. Thanks, Thomas.
