Hello,
I am trying to find a way to add layers to an existing model without changing the model's definition.
For example, let's say I have this layer that I want to insert after every ReLU layer in a ResNet:
class SimpleCustomLayer(nn.Module):
    def __init__(self):
        super().__init__()

    def forward(self, x):
        log_tensor(x)
        return x
def add_module_after_layer(module: nn.Module, layer: nn.Module, model: nn.Module):
    for i in model.modules():
        if isinstance(i, layer):
            # Here I should add the module after the layer
            ...

model = ResNet()
add_module_after_layer(module=SimpleCustomLayer,
                       layer=torch.nn.ReLU,
                       model=model)
If your original model uses a separate nn.ReLU module for each of these activation functions (i.e. it doesn't use the functional API via F.relu and doesn't reuse the same nn.ReLU module), you could replace each nn.ReLU with an nn.Sequential container holding the nn.ReLU followed by your SimpleCustomLayer.
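As a minimal illustration of that wrapping idea (using a hypothetical toy nn.Sequential model, since nn.Sequential, unlike the ResNet class, supports item assignment):

```python
import torch
import torch.nn as nn

class SimpleCustomLayer(nn.Module):
    def forward(self, x):
        # stand-in for the logging done by log_tensor(x)
        return x

# Toy model for illustration only.
model = nn.Sequential(nn.Linear(4, 4), nn.ReLU(), nn.Linear(4, 2))

for i, layer in enumerate(list(model)):
    if isinstance(layer, nn.ReLU):
        # keep the ReLU and run the custom layer right after it
        model[i] = nn.Sequential(nn.ReLU(), SimpleCustomLayer())

out = model(torch.randn(2, 4))
```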
model = torchvision.models.resnet18(pretrained=False)

class SimpleCustomLayer(nn.Module):
    def __init__(self):
        super().__init__()

    def forward(self, x):
        return x

block = nn.Sequential(nn.ReLU(), SimpleCustomLayer())

for i, l in enumerate(model.modules()):
    if isinstance(l, nn.ReLU):
        print(f"Replacing at {i}")
        model[i] = block
This gives the following error, which is understandable:
Traceback (most recent call last):
File "/path/", line 23, in <module>
model[i] = block
TypeError: 'ResNet' object does not support item assignment
Does PyTorch provide any API for swapping such layers?
You would have to assign the new module to the corresponding attribute name, e.g. via setattr. However, note that the ResNet implementation reuses its nn.ReLU modules, as seen in its definition, so your approach won't work as-is.