Turn off batch-norm but leave dropout on

Hi all,

I want to experiment a bit with Monte Carlo dropout. However, I am using ResNets (yes, I know they don’t use dropout by default, but I added it). The problem is that ResNets also use batch normalization.

Now, during inference, I need to do multiple forward passes for each image and average the results, while keeping dropout in train mode. At the same time, because batch norm behaves differently in train and eval mode, I need to keep it in eval mode. Simply doing:

net = net.eval()

obviously doesn’t work, since it puts both dropout and batch norm in eval mode.

Any solutions (I guess it is something relatively straightforward)?

This should work:

import torch.nn as nn

for m in model.modules():
    if isinstance(m, nn.BatchNorm2d):
        m.eval()  # BN uses its stored running statistics; dropout layers stay in train mode
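For context, here is a minimal sketch of what the full MC dropout inference loop might then look like. The function name mc_dropout_predict, the number of samples, and the softmax over dim=1 are illustrative assumptions, not part of the original answer:

import torch
import torch.nn as nn

def mc_dropout_predict(model, x, n_samples=20):
    # Keep the whole model in train mode so dropout stays stochastic ...
    model.train()
    # ... but put every batch norm layer back into eval mode so it uses
    # its stored running statistics instead of the current batch statistics.
    for m in model.modules():
        if isinstance(m, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
            m.eval()
    with torch.no_grad():
        probs = torch.stack([torch.softmax(model(x), dim=1) for _ in range(n_samples)])
    # Predictive mean and a simple per-class uncertainty estimate.
    return probs.mean(dim=0), probs.std(dim=0)

Checking against the BatchNorm1d/2d/3d tuple (rather than only BatchNorm2d) also covers models that mix batch norm variants.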

Worked like a charm! Thanks!

Hi @Ismail_Elezi, this doesn’t actually turn off batch norm, right? I am looking for a way to train with batch norm, but at inference time I want to use a batch size of one, so I want to turn batch norm off completely. Do you have a suggestion? @vabh

Hello,
Could you explain what you mean by ‘turn off’ batch norm?

If you set BN to eval mode, it is possible to use a batch size of 1 during inference. This is because, during testing, the stored running mean and running variance are used; these values are not calculated from the current batch.
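As a quick self-contained check of this (with made-up layer sizes and shapes), a BatchNorm2d layer in eval mode reproduces exactly the running-statistics formula, even for a batch of one:

import torch
import torch.nn as nn

bn = nn.BatchNorm2d(3)

# A few "training" batches populate running_mean / running_var.
bn.train()
for _ in range(10):
    bn(torch.randn(8, 3, 16, 16))

# In eval mode the stored running statistics are used, so a single-sample
# batch is normalized the same way regardless of its own mean and variance.
bn.eval()
x = torch.randn(1, 3, 16, 16)
out = bn(x)
expected = (x - bn.running_mean.view(1, -1, 1, 1)) / torch.sqrt(
    bn.running_var.view(1, -1, 1, 1) + bn.eps)
print(torch.allclose(out, expected, atol=1e-5))  # True (affine weight=1, bias=0 at init)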

If you set BN to eval mode, it will use the running mean and running var parameters to normalize each batch, right? But at inference, if I use a batch of one, isn’t it going to normalize the activations based on that single batch’s statistics, which might change my classifier’s results? If my understanding is wrong, please let me know.