Why do we skip initializing the running mean and running var when using a pretrained ResNet50?

Yeah, I would definitely check that out. And just a small follow-up question: the situation is that I would like to freeze the pretrained model, and the link below already shows how you can completely freeze the BN layers while training:
freeze bn

But what confuses me is these lines of code:

if freeze_bn_affine:
    # Freeze the weight/bias of BatchNorm2d
    m.weight.requires_grad = False
    m.bias.requires_grad = False
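
For context, here is a minimal sketch of how I understand that snippet is meant to be used (assuming model is a pretrained torchvision ResNet50 and freeze_bn_affine is a flag defined by the user; this is my reading, not verbatim from the linked post):

import torch.nn as nn
from torchvision import models

model = models.resnet50(pretrained=True)
freeze_bn_affine = True  # hypothetical flag controlling whether affine params are frozen

for m in model.modules():
    if isinstance(m, nn.BatchNorm2d):
        m.eval()  # stop updating running_mean / running_var
        if freeze_bn_affine:
            # eval() alone does not stop gradients for these parameters
            m.weight.requires_grad = False
            m.bias.requires_grad = False

(One thing I noticed: calling model.train() later puts the BN layers back into train mode, so the eval() call has to be re-applied, or train() overridden, if you keep training afterwards.)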

So since I have already set the BN layers to eval() mode, why do I still need this?
And after this, do I also need to set affine=False manually?
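
To make my confusion concrete, here is a toy sketch with a plain nn.BatchNorm2d (my own example, not from the linked post), showing that eval() freezes the running stats but the affine weight/bias still receive gradients:

import torch
import torch.nn as nn

bn = nn.BatchNorm2d(3)
bn.eval()  # running stats are no longer updated

x = torch.randn(4, 3, 8, 8)
stats_before = bn.running_mean.clone()
bn(x).sum().backward()

print(torch.equal(stats_before, bn.running_mean))  # True: eval() froze the stats
print(bn.weight.grad is not None)                  # True: affine params still get gradients

Is that why the requires_grad lines are still needed even after eval()?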