How to add nodes to pretrained model input?

Good day,

Question as in the title: how can I add new input nodes to a pretrained model?

An example is shown in the image below: let's say I have trained ModelA and I would like to add new input nodes to the pretrained ModelA, as in ModelB.

Example code.

import torch

class ModelA(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = torch.nn.Linear(4, 3)
        self.fc2 = torch.nn.Linear(3, 1)

    def forward(self, x):
        return self.fc2(self.fc1(x))
 
class ModelB(torch.nn.Module):
    pass
    # how do I construct ModelB?

Thanks in advance.

I guess there is a better way to do it (which will hopefully be posted in this thread), but I think adding a Linear layer in front of the input of the pretrained model should do the trick.

class ModelB(torch.nn.Module):
    def __init__(self, new_shape):
        super().__init__()
        # fc0 maps the new (wider) input down to the 4 features ModelA expects
        self.fc0 = torch.nn.Linear(new_shape, 4)
        self.pretrained = ModelA()

    def forward(self, x):
        return self.pretrained(self.fc0(x))
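If you want to reuse the trained weights, you would typically load the checkpoint into the wrapped model and, optionally, freeze it so only the new layer is trained. A minimal sketch, assuming a hypothetical checkpoint path model_a.pth and a new input size of 6:

model_b = ModelB(new_shape=6)  # 6 is an assumed new input size
model_b.pretrained.load_state_dict(torch.load("model_a.pth"))  # hypothetical checkpoint path
for p in model_b.pretrained.parameters():
    p.requires_grad = False  # optional: train only the new fc0 layer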

Thanks for the reply @ariG23498

But wouldn’t that be adding a new layer instead of adding new nodes to the pretrained input layer? Could you explain a bit more?

I haven’t come across a problem where I had to change the input nodes of a pretrained model. And come to think of it, if we apply a Linear layer in front of the pretrained model, we do not change the original architecture; the new layer just learns a linear mapping into the pretrained input space.
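For what it's worth, here is a minimal sketch of that point (all sizes are only illustrative): a Linear layer stacked in front of the pretrained first Linear layer is still a single linear map, so the two could in principle be folded into one layer.

import torch

fc0 = torch.nn.Linear(6, 4)   # assumed new layer in front
fc1 = torch.nn.Linear(4, 3)   # pretrained first layer of ModelA

x = torch.randn(2, 6)
stacked = fc1(fc0(x))

# The same mapping as a single Linear(6, 3): W = W1 @ W0, b = W1 @ b0 + b1
combined = torch.nn.Linear(6, 3)
with torch.no_grad():
    combined.weight.copy_(fc1.weight @ fc0.weight)
    combined.bias.copy_(fc1.weight @ fc0.bias + fc1.bias)

print(torch.allclose(stacked, combined(x), atol=1e-6))  # True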

I see. After googling for a while, I think the solution might be something like this
(How to add new nodes at the last layer (fully connected layer)? - #4 by MariosOreo):
pad the old model's input weights to the shape of the new model's input weights with random values and then call load_state_dict.
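A minimal sketch of that first option, assuming here that ModelB is simply ModelA with a wider first layer (6 inputs instead of 4, rather than the wrapper above), and that the two extra weight columns are filled with small random values:

class ModelB(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = torch.nn.Linear(6, 3)  # 6 = 4 old inputs + 2 new nodes (assumed)
        self.fc2 = torch.nn.Linear(3, 1)

    def forward(self, x):
        return self.fc2(self.fc1(x))

model_a = ModelA()  # in practice, load the pretrained checkpoint into it first
model_b = ModelB()

state_dict = model_a.state_dict()
old_weight = state_dict["fc1.weight"]                            # shape [3, 4]
pad = torch.randn(old_weight.size(0), 2) * 0.01                  # random init for the 2 new inputs
state_dict["fc1.weight"] = torch.cat([old_weight, pad], dim=1)   # now shape [3, 6]
model_b.load_state_dict(state_dict)                              # fc1.bias and fc2.* are reused as-is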

or

maybe replace the first linear layer of the new model with one matching the new input shape, and then assign the pretrained weights to the overlapping part, like so (How to transfer the pretrained weights for a standard ResNet50 to a 4-channel - #2 by ptrblck).
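A minimal sketch of that second option, again assuming the new input size is 6: keep the pretrained ModelA instance, swap its first layer for a wider one, and copy the old weights into the matching slice.

model = ModelA()  # in practice, load the pretrained checkpoint into it first

old_fc1 = model.fc1
new_fc1 = torch.nn.Linear(6, 3)  # 6 = 4 old inputs + 2 new nodes (assumed)

with torch.no_grad():
    new_fc1.weight[:, :4] = old_fc1.weight  # reuse pretrained weights for the original 4 inputs
    new_fc1.bias.copy_(old_fc1.bias)        # keep the pretrained bias

model.fc1 = new_fc1  # the two new weight columns keep their random initialization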

Can you have a look and let me know your thoughts? Thank you very much.

I think the second option should be the best.
