How can I replace an intermediate layer in a pre-trained network?

I am trying to change the filter size of a layer, give it new weight values, and then replace the existing layer with this new one. I can access the layer via model._modules but cannot replace it. Any help, please?


I would love to know how to do the same thing.

My way of doing it happens to be:

model._modules['layer_name'] = th.nn.some_layer(...)


Thanks for sharing! Is there any other way?
Might setattr help?
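
For example, walking a dotted module path with getattr and finishing with setattr seems to work. An untested sketch; set_layer and the resnet18 layer name are just for illustration:

import torch.nn as nn
from torchvision import models  # only to have a model to test on

model = models.resnet18()

# Walk the dotted name with getattr, then replace the leaf with setattr.
def set_layer(model, name, new_layer):
    parts = name.split('.')
    parent = model
    for p in parts[:-1]:
        parent = getattr(parent, p)
    setattr(parent, parts[-1], new_layer)

# "layer4.1.conv2" is a 512 -> 512 conv in resnet18; swap in a 5x5 kernel
# (padding=2 keeps the spatial size unchanged).
set_layer(model, 'layer4.1.conv2', nn.Conv2d(512, 512, kernel_size=5, padding=2, bias=False))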

@erogol Did you find any solution to this question?

Following the responses.

I have managed to write a function that converts all instances of a given layer type in a generic model (up to num_to_convert of them) to a different layer type. The code handles nested modules and blocks. I have modified it here for generality:

def convert_layers(model, num_to_convert, layer_type_old, layer_type_new, convert_weights=False):
    conversion_count = 0
    for name, module in reversed(list(model._modules.items())):
        if len(list(module.children())) > 0:
            # Recurse into nested modules/blocks.
            model._modules[name], num_converted = convert_layers(
                module, num_to_convert - conversion_count,
                layer_type_old, layer_type_new, convert_weights)
            conversion_count += num_converted

        if type(module) == layer_type_old and conversion_count < num_to_convert:
            layer_old = module
            layer_new = layer_type_new(module.in_channels, module.out_channels,
                                       module.kernel_size, module.stride,
                                       module.padding, module.dilation, module.groups,
                                       module.bias is not None, module.padding_mode)

            if convert_weights:
                # Carry the old layer's parameters over to the new layer.
                layer_new.weight = layer_old.weight
                layer_new.bias = layer_old.bias

            model._modules[name] = layer_new
            conversion_count += 1

    return model, conversion_count

To call the above function (note that it returns both the model and the conversion count):

new_model, num_converted = convert_layers(model, num_to_convert, layer_type_old=nn.Conv2d, layer_type_new=MyConv2d, convert_weights=False)

PS: I noticed after posting the above that this line:

layer_new = layer_type_new(module.in_channels, module.out_channels,
                           module.kernel_size, module.stride,
                           module.padding, module.dilation, module.groups,
                           module.bias is not None, module.padding_mode)

will only work if the layer to be replaced is an nn.Conv2d and the new layer takes similar constructor parameters. It would be great if someone could share a more generic way to handle this.
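
One possibly more generic direction: pass a factory callback that builds the replacement from the old module, so the conversion loop itself needs no knowledge of any particular layer's constructor. An untested sketch; the names are illustrative:

import torch.nn as nn

# The factory localizes all type-specific knowledge; the loop only
# matches types and swaps modules via setattr.
def convert_layers_generic(model, layer_type_old, layer_factory):
    for name, module in model.named_children():
        if type(module) == layer_type_old:
            setattr(model, name, layer_factory(module))
        else:
            convert_layers_generic(module, layer_type_old, layer_factory)
    return model

# Example factory: rebuild each Conv2d with reflect padding, reusing
# the old layer's hyperparameters.
conv_factory = lambda old: nn.Conv2d(
    old.in_channels, old.out_channels, old.kernel_size, old.stride,
    old.padding, old.dilation, old.groups, old.bias is not None,
    padding_mode='reflect')
model = convert_layers_generic(model, nn.Conv2d, conv_factory)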


This method does not delete the parameters of the old modules.
Is there a solution that also deletes the old modules' parameters?
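
In case it helps: once the entry in _modules is overwritten, the old layer's parameters should only be kept alive by whatever else still references them, most commonly an optimizer created before the swap. A minimal sketch of rebuilding the optimizer after the replacement (plain SGD assumed just for illustration):

import torch
import torch.nn as nn

model = nn.Sequential(nn.Conv2d(3, 16, 3))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Replace the layer; the old Conv2d (and its parameters) can be
# garbage-collected once nothing else references them.
model._modules['0'] = nn.Conv2d(3, 16, 5, padding=2)

# The optimizer above still holds the *old* parameters, so rebuild it
# so that only the new layer's parameters are tracked.
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)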