Overwrite network parameters

Hi everyone!

What is the current most efficient way to overwrite model parameters?

For example, I have a trained network with a weight parameter W, and a second parameter W_new (generated by external code, i.e. not part of the network), and I want to replace W with W_new.

W_new is already an nn.Parameter.

Cheers!
PiF

You could copy_ the new parameter's values into the old one via:

import torch

with torch.no_grad():  # avoid recording the in-place copy in autograd
    model.layer.weight.copy_(new_param)
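
For a self-contained illustration, here is a minimal sketch; the nn.Linear layer, its shapes, and the variable names are assumptions for the example, not taken from your setup:

import torch
import torch.nn as nn

model = nn.Linear(4, 2)  # hypothetical model; weight has shape (2, 4)

# W_new comes from external code and is already an nn.Parameter
W_new = nn.Parameter(torch.randn(2, 4))

with torch.no_grad():
    model.weight.copy_(W_new)  # in-place copy of the values

print(torch.equal(model.weight, W_new))  # True

Since copy_ writes the values in place, the original Parameter object is kept, so anything that already references it (e.g. an optimizer's param groups) keeps working.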