Use running mean and variance of batch norm layer in train mode

Is there a way to use the running mean and variance of a batch norm layer during training? I want to finetune a model without using the batch mean and variance: there is no easy way to synchronize the batch statistics across several GPUs, and the minibatch size on each GPU is very small.
Thank you.


The model was trained with batch norm, so I wouldn't expect its performance to hold up if you removed normalization entirely. You can, however, set just the batch norm layers to eval mode so that they use their saved running mean and variance:

import torch.nn as nn

model.train(True)
# Switch only the batch norm layers back to eval mode so they
# normalize with their running mean and variance.
for module in model.modules():
    if isinstance(module, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
        module.eval()
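
One caveat worth adding, as a hedged sketch rather than part of the original answer: any later call to model.train() (e.g. at the start of each epoch) flips the batch norm layers back to training mode, so the override has to be re-applied afterwards. The helper set_bn_eval and the toy model below are hypothetical, purely for illustration.

import torch
import torch.nn as nn

def set_bn_eval(model):
    # Put every batch norm layer into eval mode so it normalizes with
    # its stored running mean/variance instead of batch statistics.
    for m in model.modules():
        if isinstance(m, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
            m.eval()

# Hypothetical toy model, just to illustrate the call order.
model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8), nn.ReLU())

model.train()       # enables training-mode behavior everywhere
set_bn_eval(model)  # re-freeze batch norm after every train() call

out = model(torch.randn(2, 3, 16, 16))  # batch norm uses running stats here

Note that this only freezes the normalization statistics; the affine weight and bias of each batch norm layer still receive gradients and keep training unless you also set their requires_grad to False.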

I see. Thank you very much.