I faced the same problem as in the above link on CIFAR10/ResNet. My model uses batchnorm, and the updates to its running statistics change the results. I want my results to stay fixed. How can I do that?
The answer from the linked post explains that the running statistics in batchnorm layers are updated during forward passes in training mode and then used during evaluation (model.eval()).
If you want to keep these stats constant, call model.eval() and don't perform any forward passes while the model is in training mode.
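A minimal sketch of this behavior, using a single standalone BatchNorm1d layer as a stand-in for the batchnorm layers inside a ResNet (the layer and input here are illustrative, not taken from the original model):

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm1d(3)

# Fixed input with a nonzero per-channel mean, so the update is visible.
x = torch.arange(24.0).reshape(8, 3)

# Training mode: a forward pass updates running_mean / running_var.
bn.train()
before = bn.running_mean.clone()
bn(x)
print(torch.equal(before, bn.running_mean))  # False: stats were updated

# Eval mode: forward passes use the stored stats and leave them untouched.
bn.eval()
frozen = bn.running_mean.clone()
bn(x)
print(torch.equal(frozen, bn.running_mean))  # True: stats unchanged
```

If you need the rest of the model to keep training (e.g. dropout active) while only the batchnorm stats stay frozen, you can call .eval() on just the batchnorm modules instead of on the whole model.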