Transfer of weights between two different models


I have a question. I have two different models. I would like to transfer the learned weights of model1 to a different model (model2, an autoencoder with different layers and structure), such that the output of this second network at its first iteration (epoch) equals the output of the trained model1.

I do not have any code yet, because I simply do not know where to begin. Is it possible to implement this? If so, I would appreciate a response.


Assuming the used layers are equal in their parameter setup (i.e. they use parameters and buffers of the same shapes), you could either copy the data manually, e.g. via:

with torch.no_grad():
    model2.layer.weight.copy_(model1.layer.weight)

or you could manipulate the keys of the state_dict to match the target model and load it.
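As a minimal sketch of the state_dict approach (the layer names `fc` and `encoder` are made up for illustration), you could rename the keys from the source model's naming to the target model's before loading:

```python
import torch

# Source model: a single linear layer named "fc" (hypothetical setup)
model1 = torch.nn.Sequential()
model1.add_module("fc", torch.nn.Linear(4, 4))

# Target model: a layer with the same parameter shapes, named "encoder"
model2 = torch.nn.Sequential()
model2.add_module("encoder", torch.nn.Linear(4, 4))

# Rename the state_dict keys from model1's naming to model2's and load
sd = model1.state_dict()
renamed = {k.replace("fc", "encoder"): v for k, v in sd.items()}
model2.load_state_dict(renamed)

# The transferred parameters now match the source model's
print(torch.equal(model2.encoder.weight, model1.fc.weight))  # True
```

This only works when the renamed keys line up one-to-one with the target model's parameter and buffer shapes; `load_state_dict` will raise an error otherwise.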

Thanks. Both models are entirely different. Below is model1, whose output I would like to reproduce on the first iteration of my second model.
class Model(torch.nn.Module):
    def __init__(self, initial, min_vel, max_vel):
        super().__init__()
        self.min_vel = min_vel
        self.max_vel = max_vel
        self.model = torch.nn.Parameter(
            torch.logit((initial - min_vel) /
                        (max_vel - min_vel)))

    def forward(self):
        return (torch.sigmoid(self.model) *
                (self.max_vel - self.min_vel) +
                self.min_vel)

model = Model(v_init, 1000, 2500).to(device)

and model2 is a deep convolutional autoencoder.

If your models are “entirely different”, I assume their structure and layers do not match. In this case you might not be able to transfer weights between them unless you can create a mapping between layers. E.g. a 1x1 conv layer could be mapped to a linear layer, but of course not every layer is transferable to any other module.
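To illustrate the 1x1-conv-to-linear mapping mentioned above (a sketch with made-up shapes, not code from this thread): a `Conv2d(8, 16, kernel_size=1)` has a weight of shape `[16, 8, 1, 1]`, which can be viewed as the `[16, 8]` weight of a `Linear(8, 16)` layer:

```python
import torch

conv = torch.nn.Conv2d(8, 16, kernel_size=1)
lin = torch.nn.Linear(8, 16)

with torch.no_grad():
    # conv.weight has shape [16, 8, 1, 1]; drop the 1x1 spatial dims
    lin.weight.copy_(conv.weight.view(16, 8))
    lin.bias.copy_(conv.bias)

# Both modules now compute the same function on matching inputs
x = torch.randn(2, 8)
out_lin = lin(x)
out_conv = conv(x.view(2, 8, 1, 1)).view(2, 16)
print(torch.allclose(out_lin, out_conv, atol=1e-6))
```

For layers without such an obvious correspondence (e.g. a conv with a larger kernel and a linear layer of a different size), there is no canonical mapping and you would have to decide on one yourself, or simply train the second model from scratch.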