Cannot freeze batch normalization parameters

No, sorry for the misunderstanding.
The affine argument is only taken into account when the model is instantiated.
If the nn.BatchNorm layers were already created with affine=True, both parameters (weight and bias) will be part of the model, and you should treat them like any other parameters, i.e. set requires_grad=False if you don't want to train them further.
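For example, a minimal sketch (assuming a small model with nn.BatchNorm2d layers created with affine=True, which is the default) showing how you could freeze these parameters:

```python
import torch.nn as nn

# Hypothetical model containing BatchNorm layers created with affine=True (the default)
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.BatchNorm2d(16),  # affine=True by default, so weight and bias exist
    nn.ReLU(),
)

# Freeze the affine parameters (weight and bias) of every BatchNorm layer
for module in model.modules():
    if isinstance(module, nn.BatchNorm2d):
        module.weight.requires_grad = False
        module.bias.requires_grad = False

# Note: requires_grad only affects the learnable affine parameters.
# The running statistics (running_mean/running_var) are buffers and are still
# updated in train() mode; call module.eval() if you also want to freeze them.
```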
