Can I remove a layer from a pre-trained model while loading the model weights?

I don't think there is an easy way to do this automatically. :confused:
You could abuse the __getattr__ and __setattr__ methods and manipulate the modules with them.
E.g. here is a working (but not recommended) code snippet to add nn.Identity() modules in front of all conv layers in a ResNet:

import torch.nn as nn
from torchvision import models

model = models.resnet18()

# Get the names of all layers you would like to change
layers_to_change = []
for name, module in model.named_modules():
    if isinstance(module, nn.Conv2d):
        print('found ', name)
        layers_to_change.append(name)

# Iterate all layers to change
for layer_name in layers_to_change:
    # Check if the name is nested, e.g. 'layer1.0.conv1'
    *parent, child = layer_name.split('.')
    # Nested
    if len(parent) > 0:
        # Walk down to the direct parent module of the conv layer
        m = model.__getattr__(parent[0])
        for p in parent[1:]:
            m = m.__getattr__(p)
    else:
        # Not nested: the parent is the model itself
        m = model
    # Get the original conv layer
    orig_layer = m.__getattr__(child)
    # Add your layer here by replacing the conv layer with a
    # small nn.Sequential wrapping it
    m.__setattr__(child, nn.Sequential(
        nn.Identity(), orig_layer))

print(model)
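As a quick sanity check you can run a dummy forward pass to confirm the wrapped layers still work (the input shape below is just the standard ImageNet size, assumed here for illustration):

import torch

x = torch.randn(1, 3, 224, 224)
out = model(x)
print(out.shape)  # torch.Size([1, 1000]) for the default resnet18 head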

While this seems to work for this model, I would recommend creating a new model class by deriving from the desired model as the base class and manipulating the layers inside its __init__ method, as sketched below.
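Here is a minimal sketch of that approach, which also answers the original question of removing a layer while loading the pretrained weights. MyResNet is a made-up name, and the pretrained=True argument assumes an older torchvision release (newer versions use the weights argument instead):

import torch.nn as nn
from torchvision import models
from torchvision.models.resnet import ResNet, BasicBlock

class MyResNet(ResNet):
    def __init__(self):
        # BasicBlock with [2, 2, 2, 2] blocks is the resnet18 configuration
        super().__init__(BasicBlock, [2, 2, 2, 2])
        # copy the pretrained weights into this instance
        self.load_state_dict(models.resnet18(pretrained=True).state_dict())
        # "remove" the final classifier by replacing it with a no-op
        self.fc = nn.Identity()

model = MyResNet()

Since the classifier weights are loaded first and only then replaced by nn.Identity(), the rest of the pretrained backbone stays intact.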
