Difference in BatchNorm behavior between the two states (training and testing)

From the original BatchNorm paper, we know that BN was introduced to reduce internal covariate shift.

I am just wondering how BN behaves correctly in each state.

I checked the docs and my guess is:

Train ----> set Model.train()

Test -----> set Model.eval()

Am I right?


Yes, this is correct. The same applies to dropout as well. Switching between `train()` and `eval()` changes which statistics BN uses: the current batch's statistics during training, or the accumulated running estimates during evaluation.
http://pytorch.org/docs/nn.html#torch.nn.Module.eval
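A minimal sketch of the difference (the layer sizes and input distribution here are arbitrary, chosen just for illustration): in `train()` mode the same BatchNorm layer normalizes with the batch's own mean and variance (and updates its running estimates), while in `eval()` mode it reuses those stored running estimates instead.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

bn = nn.BatchNorm1d(3)          # 3 features, default affine weight=1, bias=0
x = torch.randn(8, 3) * 2 + 5   # a batch whose mean/std differ from 0/1

bn.train()                      # training mode: normalize with batch stats
y_train = bn(x)                 # also updates running_mean / running_var

bn.eval()                       # eval mode: use the stored running stats
y_eval = bn(x)

# In train mode the output is normalized per batch, so its per-feature
# mean is ~0; in eval mode the running estimates (only partially updated
# after one batch) are used, so the same input is normalized differently.
print(y_train.mean(dim=0))      # close to zero
print(y_eval.mean(dim=0))       # clearly away from zero
```

Note that the running estimates are updated with a momentum term (0.1 by default), so after only one training batch they are still far from the batch statistics, which is why the two outputs differ so visibly here.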