Why track_running_stats is not set to False during eval

Why is track_running_stats set to True in eval? This may lead to performance degradation of a pretrained model, and in my opinion False should be the default behaviour.
Also, what is the recommended way to set track_running_stats to False? Currently I am just using model.apply(fn), as sketched below.
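For reference, the apply-based workaround I mean looks roughly like this (the helper name and the placeholder model are just illustrative; I only target the public BatchNorm classes):

import torch.nn as nn

def disable_running_stats(module):
    # illustrative helper: flip the flag on every BatchNorm layer so the
    # running estimates are no longer updated, even in train() mode
    if isinstance(module, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
        module.track_running_stats = False

# small placeholder model, just to show the apply() call
model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8), nn.ReLU())
model.apply(disable_running_stats)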

In fact, if a model with track_running_stats enabled is deployed, isn't it quite vulnerable? Suppose it is handed to a user: they could feed it gibberish data and corrupt the running stats.
Something like this happened to me, although instead of gibberish I had a batch size of 1, and running_mean and running_var changed so much that my model's performance dropped quite drastically!
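To make the failure mode concrete, here is a toy sketch of what I think happened (the shifted, batch-size-1 inputs below just stand in for my real validation data; the exact values are made up):

import torch
import torch.nn as nn

torch.manual_seed(0)
bn = nn.BatchNorm2d(3)

# estimate the running stats on well-behaved, training-like data
for _ in range(100):
    bn(torch.randn(32, 3, 24, 24))
print(bn.running_mean, bn.running_var)

# forgetting eval(): batch-size-1 passes from a shifted distribution
# keep updating the running estimates and drag them away noticeably
for _ in range(10):
    bn(torch.randn(1, 3, 24, 24) * 5 + 2)
print(bn.running_mean, bn.running_var)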

track_running_stats is used to initialize the running estimates and to check whether they should be updated during training (line of code).
The running estimates won't be updated in eval:

import torch
import torch.nn as nn

# default: training mode with track_running_stats=True,
# so the running estimates are updated on every forward pass
bn = nn.BatchNorm2d(3)
for _ in range(10):
    x = torch.randn(10, 3, 24, 24)
    out = bn(x)
print(bn.running_mean)
print(bn.running_var)
> tensor(1.00000e-03 *
       [-0.7753,  0.7027, -1.4181])
  tensor([ 1.0015,  1.0021,  0.9947])

# after switching to eval(), the running estimates are frozen
bn.eval()
for _ in range(10):
    x = torch.randn(10, 3, 24, 24)
    out = bn(x)
print(bn.running_mean)
print(bn.running_var)
> tensor(1.00000e-03 *
       [-0.7753,  0.7027, -1.4181])
  tensor([ 1.0015,  1.0021,  0.9947])

Ohhh, sorry, my bad. I rechecked my script and I had forgotten to call model.eval() in one place, which caused this mess.

Thanks a lot @ptrblck :smiley: :sweat_smile:

Hi, there may be a bug with track_running_stats=False: the BatchNorm layer continues to track running_mean.

What kind of bug did you observe? Did the running_mean update even after calling eval() on the module?
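As a quick sanity check (just a sketch): with track_running_stats=False the layer shouldn't hold running estimates at all, and with the default settings eval() should freeze them, e.g.:

import torch
import torch.nn as nn

bn = nn.BatchNorm2d(3, track_running_stats=False)
print(bn.running_mean, bn.running_var)   # expected: None None, nothing is tracked

bn = nn.BatchNorm2d(3)                   # default: track_running_stats=True
bn(torch.randn(10, 3, 24, 24))           # one training-mode pass updates the stats
bn.eval()
before = bn.running_mean.clone()
bn(torch.randn(10, 3, 24, 24))
print(torch.equal(before, bn.running_mean))  # expected: True, eval() freezes them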

Hi @ptrblck, yes, it is happening even when I call model.eval(). When validating the model on the same test data, the performance degrades when I use shuffle=True. Please see the details with code here:

Model.eval() giving different result when shuffle is True and False

I’ve answered in the linked thread and cannot reproduce the issue.