Changing Conv2d layers after model creation

I want to modify the parameters of every nn.Conv2d layer in a model. Specifically, I want to change kernel_size from (X, Y) to (X, 3), padding from (X//2, Y//2) to (X//2, 1), stride from (SX, SY) to (SX, 1), and of course have the weight tensor modified accordingly.

I started doing something like:

```python
for name, module in net.named_modules():
    if isinstance(module, nn.Conv2d):
        # force the width component of the stride to 1
        stride = list(module.stride)
        stride[1] = 1
        module.stride = tuple(stride)
```

Then I was about to change module.weight when I realised that a lot of initialisation is taken care of in the layer's constructor. So really, rather than modifying the internals of the existing layer, I should be constructing a fresh one. How do I do this?

The reason why i want to do this is because i want to do something like:

```python
import timm

net = timm.create_model("resnet50")
# ...adapt the Conv2d layers here...
```


Why don't you simply assign a new layer in the same for loop? It depends on the type of model. With nn.Sequential you can index into it like a list and reassign directly; check the code below. If you are using vision models, you can use .features to access each layer and reassign it with a new one.

```python
import torch

model = torch.nn.Sequential()
model.add_module('a', torch.nn.Linear(1, 2))
model.add_module('b', torch.nn.Linear(3, 4))
model[0] = torch.nn.Linear(5, 6)  # index like a list and reassign
```

Output:

```
Sequential(
  (a): Linear(in_features=5, out_features=6, bias=True)
  (b): Linear(in_features=3, out_features=4, bias=True)
)
```
Ideally I would like there to be no assumptions about the model structure. Otherwise I have to write model-specific code for each architecture.

Iterating through named_modules was the only way I found to do this. Unfortunately, you cannot modify the modules in place, at least not easily from the looks of it.


If I understood your case correctly, this post might be able to help. The issue with the snippet there is that it has not been extended to support hierarchically nested layers, but that won't be hard to do.
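For the hierarchical case, one possible sketch: collect the Conv2d layers via `named_modules()`, build a fresh `nn.Conv2d` from each old layer's attributes, and reattach it to its parent with `get_submodule` plus `setattr`. The weight-transfer rule below (keeping up to three centre columns of the old kernel) is an assumption on my part, not the only sensible choice:

```python
import torch
import torch.nn as nn

def adapt_conv2d(model: nn.Module) -> None:
    """Replace every nn.Conv2d so the width component of kernel_size /
    stride / padding becomes 3 / 1 / 1, reusing the old weights."""
    # Collect targets first: mutating while iterating named_modules()
    # would change the structure being traversed.
    targets = [(name, m) for name, m in model.named_modules()
               if isinstance(m, nn.Conv2d)]
    for name, old in targets:
        kh, kw = old.kernel_size
        new = nn.Conv2d(
            old.in_channels, old.out_channels,
            kernel_size=(kh, 3),
            stride=(old.stride[0], 1),
            padding=(old.padding[0], 1),
            dilation=old.dilation,
            groups=old.groups,
            bias=old.bias is not None,
        )
        with torch.no_grad():
            # Assumption: copy up to three centre columns of the old
            # kernel, centred in the new one; fresh init elsewhere.
            mid = kw // 2
            lo, hi = max(mid - 1, 0), min(mid + 2, kw)
            off = (3 - (hi - lo)) // 2
            new.weight[..., off:off + (hi - lo)].copy_(old.weight[..., lo:hi])
            if old.bias is not None:
                new.bias.copy_(old.bias)
        # Resolve the parent from the dotted name and reassign the child,
        # so nesting depth does not matter.
        if "." in name:
            parent_name, child_name = name.rsplit(".", 1)
            parent = model.get_submodule(parent_name)
        else:
            parent, child_name = model, name
        setattr(parent, child_name, new)
```

For example, running `adapt_conv2d` on a model containing a nested `nn.Sequential` replaces the inner convs too, with no model-specific code.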