Load part of a model with pretrained weights; reinitialize the other part with Xavier init

Currently I load a pretrained 7-layer network model using the following command:

myNet = torch.load('path_to_model')

Now I want to keep training the pretrained weights in the first 5 layers and reinitialize the last two layers with Xavier init (or any other initialization).

How can I do this?

You could load the pretrained weights and just call the desired init method on the last two layers:

model = ...
model.load_state_dict(torch.load(...))  #  or your equivalent code using torch.load
with torch.no_grad():
    torch.nn.init.xavier_uniform_(model.fc1.weight)
    torch.nn.init.zeros_(model.fc1.bias)
    ...
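A minimal runnable sketch of this approach (the 7-layer Sequential model here is hypothetical, standing in for your pretrained net, and it assumes the last two layers are Linear):

```python
import torch
import torch.nn as nn

# Hypothetical 7-layer model; in practice you would load your pretrained
# weights first, e.g. model.load_state_dict(torch.load('path_to_model'))
model = nn.Sequential(
    *[nn.Linear(10, 10) for _ in range(5)],  # first 5 layers keep their weights
    nn.Linear(10, 10),                       # layer 6: will be reinitialized
    nn.Linear(10, 2),                        # layer 7: will be reinitialized
)

# Reinitialize only the last two layers with Xavier init
with torch.no_grad():
    for layer in list(model.children())[-2:]:
        nn.init.xavier_uniform_(layer.weight)
        nn.init.zeros_(layer.bias)
```

If your model exposes the layers as named attributes instead of a Sequential, just call the init functions on those attributes directly, as in the snippet above.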

Okay, got it. But can you tell me why we would need the torch.no_grad() here?

In the posted code snippet you won't really need it.
It was just a safety measure in case you had performed some operations on the parameters before loading the weights, so if you initialize the parameters right after loading the model, you can just remove it.
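To illustrate the failure mode torch.no_grad() guards against (a sketch; the nn.init functions themselves already disable gradient tracking internally in recent PyTorch versions): a plain in-place write to a parameter that requires gradients raises a RuntimeError outside of no_grad(), but is allowed inside it.

```python
import torch
import torch.nn as nn

layer = nn.Linear(4, 4)

# An in-place write to a leaf parameter that requires grad fails
# when autograd is tracking operations:
try:
    layer.weight += 1.0
except RuntimeError as e:
    print('raised:', e)

# Inside no_grad() the same update is allowed and not recorded by autograd:
with torch.no_grad():
    layer.weight += 1.0
```

The parameter still has requires_grad=True afterwards, so training proceeds normally; no_grad() only keeps the manual update itself out of the autograd graph.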
