How to train the Batch Normalization layers in a pretrained network

Hello, I have a pretrained network from the model zoo. That network has batch normalization layers that, along with all the other layers, were pretrained on the ImageNet dataset. I am trying to use the network as a fixed feature extractor for transfer learning and have replaced the classifier with my own classifier. However, I don't fully understand how to unfreeze only the Batch Norm layers so that they can train along with the classifier while the rest of the layers remain fixed. As far as I understand, enabling grad on the Batch Norm layers will enable the gradients on all the remaining layers in the network, since all the nodes depend on the Batch Norm outputs. Surely there has to be a way to make the Batch Norm layers train without unfreezing the rest of the network? Any help is appreciated, thank you.
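
For reference, here is roughly what my setup looks like. This is a minimal sketch assuming torchvision's resnet18 as the pretrained model; the class count of the new classifier is arbitrary:

```python
import torch.nn as nn
from torchvision import models

# Load a model pretrained on ImageNet (resnet18 as an example).
model = models.resnet18(pretrained=True)

# Freeze the whole network so it acts as a fixed feature extractor.
for param in model.parameters():
    param.requires_grad = False

# Replace the classifier with my own head (10 classes as a placeholder).
# Newly constructed modules have requires_grad=True by default.
model.fc = nn.Linear(model.fc.in_features, 10)
```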


Unfreezing only the batch norm layers will not unfreeze the other, non-BN layers. `requires_grad` is a per-parameter flag: it only controls whether a gradient is accumulated for that particular parameter. Gradients still flow *through* the frozen layers during backpropagation so the classifier and batch norm parameters can be updated, but the frozen layers themselves receive no updates.
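
A minimal sketch of how you could do this, assuming a torchvision model whose batch norm layers are standard `nn.BatchNorm2d` modules:

```python
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(pretrained=True)

# Freeze everything first.
for param in model.parameters():
    param.requires_grad = False

# Unfreeze only the affine parameters (weight and bias) of the BN layers.
for module in model.modules():
    if isinstance(module, nn.BatchNorm2d):
        for param in module.parameters():
            param.requires_grad = True

# Pass only the trainable parameters to the optimizer.
optimizer = torch.optim.SGD(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
```

One caveat: `requires_grad` only covers the learnable affine parameters. The running mean and variance are buffers, not parameters, and they are updated whenever the model is in `train()` mode regardless of any `requires_grad` settings. If you also want those statistics to keep updating on your new data, this is what you want; if you instead want to keep the ImageNet statistics fixed, call `.eval()` on the batch norm modules after `model.train()`.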