Adding layers after defining the model

I am trying to find a way to add layers to a model after it has been created, without changing the definition of the model.

For example, let's say I have this layer that I want to add after every ReLU layer in a ResNet:

import torch
import torch.nn as nn

class SimpleCustomLayer(nn.Module):
    def __init__(self):
        super().__init__()

    def forward(self, x):
        return x

def add_module_after_layer(module: nn.Module, layer: type, model: nn.Module):
    for i in model.modules():
        if isinstance(i, layer):
            # Here I should add the module after layer i
            pass

model = ResNet()

add_module_after_layer(module=SimpleCustomLayer(),
                       layer=torch.nn.ReLU,
                       model=model)

Any idea how to add this module after layer i?

If your original model uses an nn.ReLU module for each of these activation functions (i.e. it is not using the functional API via F.relu and not reusing the same nn.ReLU module), you could replace each nn.ReLU with an nn.Sequential container holding the nn.ReLU and your SimpleCustomLayer.
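As a minimal sketch (assuming a stateless SimpleCustomLayer like the one above), the replacement container would look like this:

```python
import torch
import torch.nn as nn

class SimpleCustomLayer(nn.Module):
    def forward(self, x):
        return x  # stateless placeholder

# The container runs the ReLU first and the custom layer second,
# so it behaves like "ReLU followed by SimpleCustomLayer":
wrapped = nn.Sequential(nn.ReLU(), SimpleCustomLayer())
out = wrapped(torch.tensor([-1.0, 2.0]))
print(out)  # tensor([0., 2.])
```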

import torch
import torch.nn as nn
import torchvision

model = torchvision.models.resnet18(pretrained=False)

class SimpleCustomLayer(nn.Module):
    def __init__(self):
        super().__init__()

    def forward(self, x):
        return x

block = nn.Sequential(nn.ReLU(), SimpleCustomLayer())
for i, l in enumerate(model.modules()):
    if isinstance(l, nn.ReLU):
        print(f"Replacing at {i}")
        model[i] = block

This gives the following error, which is understandable:

Traceback (most recent call last):
  File "/path/", line 23, in <module>
    model[i] = block
TypeError: 'ResNet' object does not support item assignment

Does PyTorch provide any API for swapping such layers?

You would have to assign the new module to the attribute name, e.g. via setattr. However, note that the resnet implementation reuses its nn.ReLU modules (as seen in its definition), so your approach won't work.
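A sketch of the setattr approach; the helper name replace_relu and the small stand-in model are my own, not from the thread, and the same recursion applies to torchvision's resnet18:

```python
import torch
import torch.nn as nn

class SimpleCustomLayer(nn.Module):
    def forward(self, x):
        return x  # stateless placeholder

def replace_relu(model: nn.Module) -> None:
    # named_children yields (attribute_name, module) pairs for the
    # direct children, so setattr swaps exactly the attribute that
    # the parent's forward() calls.
    for name, child in list(model.named_children()):
        if isinstance(child, nn.ReLU):
            setattr(model, name,
                    nn.Sequential(nn.ReLU(inplace=True), SimpleCustomLayer()))
        else:
            replace_relu(child)  # recurse into nested blocks

# Small stand-in model so the sketch stays self-contained; with a
# ResNet you would call replace_relu on the loaded model instead.
model = nn.Sequential(nn.Linear(4, 4), nn.ReLU(), nn.Linear(4, 2), nn.ReLU())
replace_relu(model)
print(model)
```

Note that this creates a fresh nn.Sequential per ReLU attribute, so no container is accidentally shared between different parents.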

Why would it not work if it's reusing the modules? Neither nn.ReLU nor my custom layer has any parameters to learn.

I’ve assumed that your custom layer would have parameters, but you are right that it would work if that’s not the case.
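To make the reuse caveat concrete: torchvision's BasicBlock calls the same self.relu instance twice in forward(), so a replacement assigned to that attribute runs at both call sites. With a stateless layer that is harmless; with parameters, they would be shared across both call sites. A hypothetical illustration (Scale and Block are made-up names):

```python
import torch
import torch.nn as nn

class Scale(nn.Module):
    # A custom layer WITH a parameter, to show why reuse matters.
    def __init__(self):
        super().__init__()
        self.weight = nn.Parameter(torch.ones(1))

    def forward(self, x):
        return x * self.weight

class Block(nn.Module):
    def __init__(self):
        super().__init__()
        # One attribute, mimicking how BasicBlock reuses self.relu:
        self.relu = nn.Sequential(nn.ReLU(), Scale())

    def forward(self, x):
        x = self.relu(x)      # first call site
        return self.relu(x)   # second call site: same Scale weight

block = Block()
# Only ONE learnable parameter, shared by both call sites:
print(len(list(block.parameters())))  # 1
```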

Thank you for your help!