Fine-tune a network trained with BatchNorm after removing the BatchNorm layers

Is it possible to remove the BatchNorm layers from an already trained network and then fine-tune the resulting network?

How could I load the parameters from the old network into the new network?

You would need to either add a linear layer that reproduces the learned scaling, or fuse the scaling into the linear layer that precedes the BatchNorm.
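To make the fusion concrete: in eval mode a BatchNorm layer applies `y = gamma * (z - mu) / sqrt(var + eps) + beta` to the output `z = Wx + b` of the preceding linear layer, so the pair collapses into a single affine map with `W' = diag(gamma / std) W` and `b' = gamma * (b - mu) / std + beta`. A minimal NumPy sketch (shapes and names are illustrative) that verifies the fused layer matches the original pair:

```python
import numpy as np

def fuse_linear_bn(W, b, gamma, beta, mu, var, eps=1e-5):
    """Fold BatchNorm statistics into the preceding linear layer.

    y = gamma * (W x + b - mu) / sqrt(var + eps) + beta
      = (gamma / std) * W x + gamma * (b - mu) / std + beta
    """
    std = np.sqrt(var + eps)
    W_fused = (gamma / std)[:, None] * W   # scale each output row of W
    b_fused = gamma * (b - mu) / std + beta
    return W_fused, b_fused

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))
b = rng.standard_normal(4)
gamma = rng.standard_normal(4)
beta = rng.standard_normal(4)
mu = rng.standard_normal(4)
var = rng.random(4) + 0.1  # keep variance positive

x = rng.standard_normal(3)
y_ref = gamma * (W @ x + b - mu) / np.sqrt(var + 1e-5) + beta
W_f, b_f = fuse_linear_bn(W, b, gamma, beta, mu, var)
y_fused = W_f @ x + b_f
print(np.allclose(y_ref, y_fused))  # True
```

Note this identity only holds for the frozen running statistics used at inference time; after fusing, fine-tuning proceeds on `W'` and `b'` directly.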

Exactly, that is what I was thinking. I need more details on how to merge or modify the parameters.
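One way to carry the old parameters over, sketched here with plain dicts of NumPy arrays standing in for a PyTorch `state_dict` (the layer names `fc1`/`bn1` are hypothetical, but the key suffixes follow the usual PyTorch convention): fuse each Linear/BatchNorm pair in the old checkpoint into a single weight and bias, then load the result into the BN-free network.

```python
import numpy as np

def fuse_checkpoint(state, linear_name, bn_name, eps=1e-5):
    """Merge a BatchNorm's parameters and running statistics into the
    preceding linear layer, returning entries for the BN-free network."""
    W = state[f"{linear_name}.weight"]
    b = state.get(f"{linear_name}.bias", np.zeros(W.shape[0]))
    gamma = state[f"{bn_name}.weight"]
    beta = state[f"{bn_name}.bias"]
    mu = state[f"{bn_name}.running_mean"]
    var = state[f"{bn_name}.running_var"]
    std = np.sqrt(var + eps)
    return {
        f"{linear_name}.weight": (gamma / std)[:, None] * W,
        f"{linear_name}.bias": gamma * (b - mu) / std + beta,
    }

# Toy checkpoint: identity statistics, gamma = 2, so fusion should
# simply double the weights and leave the bias at zero.
old_state = {
    "fc1.weight": np.ones((2, 2)),
    "fc1.bias": np.zeros(2),
    "bn1.weight": np.full(2, 2.0),
    "bn1.bias": np.zeros(2),
    "bn1.running_mean": np.zeros(2),
    "bn1.running_var": np.ones(2) - 1e-5,  # so std is exactly 1
}
new_entries = fuse_checkpoint(old_state, "fc1", "bn1")
print(new_entries["fc1.weight"])  # each weight doubled by gamma/std
```

In PyTorch you would build the new state dict this way for every fused pair, drop the `bn*` keys, and pass the result to `new_model.load_state_dict(...)` on the network defined without BatchNorm layers.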