Replace intermediate layers in a model

Hi, I am trying to replace the BatchNorm layers with SyncBatchNorm layers in a pretrained network (of type nn.Sequential), and I am using the code below:

    for k, m in enumerate(Net):
        if type(m).__name__ == 'BatchNorm2d':
            planes = m.num_features
            eps = m.eps
            momentum = m.momentum
            affine = m.affine

            Net._modules[k] = SynchronizedBatchNorm2d(planes, eps, momentum, affine)
            # init parameters
            Net._modules[k].running_mean = m.running_mean
            Net._modules[k].running_var = m.running_var
            if affine:
                Net._modules[k].weight = m.weight
                Net._modules[k].bias = m.bias

but I got the following error when I print the resulting Net:

*** TypeError: must be str, not int

Can anyone give a hint on why the above does not work, and suggest a solution? (I know one workaround would be to redefine the network with BatchNorm replaced by SyncBatchNorm.)
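For reference, the error comes from how nn.Sequential stores its children: `Net._modules` is an OrderedDict keyed by strings ("0", "1", ...), so indexing it with the integer `k` from enumerate() raises `TypeError: must be str, not int`. A minimal sketch of a fix that keeps nn.Sequential, converting the key with `str(k)`; it uses PyTorch's built-in torch.nn.SyncBatchNorm as a stand-in for SynchronizedBatchNorm2d (the replacement logic is the same):

```python
import torch.nn as nn

# Toy pretrained network of type nn.Sequential.
net = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1),
    nn.BatchNorm2d(8),
    nn.ReLU(),
)

for k, m in enumerate(net):
    if isinstance(m, nn.BatchNorm2d):
        new_bn = nn.SyncBatchNorm(m.num_features, m.eps, m.momentum, m.affine)
        # copy running statistics and (if present) affine parameters
        new_bn.running_mean = m.running_mean
        new_bn.running_var = m.running_var
        if m.affine:
            new_bn.weight = m.weight
            new_bn.bias = m.bias
        # _modules is keyed by strings, so convert the index
        net._modules[str(k)] = new_bn

print(type(net[1]).__name__)  # SyncBatchNorm
```

The same `str(k)` conversion should work with SynchronizedBatchNorm2d in place of nn.SyncBatchNorm.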


Reply to myself: I found that if the pretrained network is changed from nn.Sequential to nn.ModuleList, and the code is changed as below:

    for k, m in enumerate(Net):
        if type(m).__name__ == 'BatchNorm2d':
            planes = m.num_features
            eps = m.eps
            momentum = m.momentum
            affine = m.affine

            Net[k] = SynchronizedBatchNorm2d(planes, eps, momentum, affine)
            # init parameters
            Net[k].running_mean = m.running_mean
            Net[k].running_var = m.running_var
            if affine:
                Net[k].weight = m.weight
                Net[k].bias = m.bias

it works fine for me. But I am wondering if there is a solution for a network of type nn.Sequential.
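One general approach that handles nn.Sequential (and any nesting of containers) is to recurse with named_children() and assign via setattr, which goes through nn.Module.__setattr__ and updates the string-keyed `_modules` dict correctly. A sketch, again using torch.nn.SyncBatchNorm as a stand-in for SynchronizedBatchNorm2d:

```python
import torch.nn as nn

def replace_bn(module):
    """Recursively swap every BatchNorm2d for SyncBatchNorm in place."""
    for name, child in module.named_children():
        if isinstance(child, nn.BatchNorm2d):
            new_bn = nn.SyncBatchNorm(child.num_features, child.eps,
                                      child.momentum, child.affine)
            new_bn.running_mean = child.running_mean
            new_bn.running_var = child.running_var
            if child.affine:
                new_bn.weight = child.weight
                new_bn.bias = child.bias
            # setattr works for Sequential too: its child names are
            # the strings "0", "1", ... stored in _modules
            setattr(module, name, new_bn)
        else:
            replace_bn(child)

net = nn.Sequential(
    nn.Conv2d(3, 8, 3),
    nn.BatchNorm2d(8),
    nn.Sequential(nn.BatchNorm2d(8)),  # nested container
)
replace_bn(net)
```

Note that recent PyTorch versions also ship a built-in helper, torch.nn.SyncBatchNorm.convert_sync_batchnorm(model), which does exactly this recursive replacement for torch's own SyncBatchNorm.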
