Train() and eval() for BatchNorm and Dropout

Hi all,

For a particular use case, I want to use eval() mode for the BatchNorm layers and train() mode for the Dropout layers during training. Does anyone know how to do this? If I simply call model.train() or model.eval(), both layer types end up in the same mode.

You can call train/eval on individual (sub-) modules, e.g.

for m in model.modules():
    if isinstance(m, torch.nn.BatchNorm2d):
        m.eval()

should do the trick.
Note that the weight and bias of the batch norm layers will still require gradients and be updated during training. Set their requires_grad to False if you want to avoid that.
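Putting both points together, a minimal sketch might look like this (the toy model here is just for illustration): call model.train() first, then switch only the BatchNorm layers to eval mode, and optionally freeze their affine parameters so the optimizer does not update them.

```python
import torch
import torch.nn as nn

# Hypothetical toy model for illustration.
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3),
    nn.BatchNorm2d(8),
    nn.ReLU(),
    nn.Dropout(0.5),
)

model.train()  # puts every submodule in train mode, including Dropout

# Switch only the BatchNorm layers to eval mode, so they use
# their running statistics instead of the current batch statistics.
for m in model.modules():
    if isinstance(m, nn.BatchNorm2d):
        m.eval()
        # Optionally also freeze the affine parameters so they
        # receive no gradient updates during training.
        if m.affine:
            m.weight.requires_grad_(False)
            m.bias.requires_grad_(False)
```

After this, Dropout stays active during training while BatchNorm behaves as in evaluation, both in its forward pass and (if frozen) in its parameters.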

Best regards


Hi Tom,

Thank you so much for helping! I really appreciate it!