Overwrite network parameters

Hi everyone!

What is the current most efficient way to overwrite model parameters?

For example, I have a trained network with a given weight parameter W, and I have another parameter W_new (generated by external code, i.e. not part of the network), and I want to replace W with W_new.

W_new is already an nn.Parameter.


You could .copy_ the new parameter into the old one via:

with torch.no_grad():
    model.weight.copy_(W_new)
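A minimal runnable sketch of the approach above, assuming the network is a plain nn.Linear layer named model (any module with a weight parameter works the same way):

```python
import torch
import torch.nn as nn

# Stand-in for a trained network with a weight parameter W.
model = nn.Linear(4, 2)

# Stand-in for an externally generated replacement parameter W_new
# (the ones-tensor values are just illustrative).
W_new = nn.Parameter(torch.ones(2, 4))

# Copy the new values into the existing parameter in-place.
# no_grad() keeps the copy itself out of the autograd graph.
with torch.no_grad():
    model.weight.copy_(W_new)

print(torch.equal(model.weight, W_new))  # the values now match
```

Note that copy_ overwrites the values while keeping the original nn.Parameter object alive, so an optimizer that was already constructed with model.parameters() continues to update the same tensor; assigning model.weight = W_new instead would swap in a different object.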