Access and overwrite ALL the weights (automatically, in a loop)

Hi, I’m looking for a way to access all the weights in a model automatically (i.e. without manually referring to each layer by name) so that I can overwrite them.

I understand that I can see them all by doing:

for param in model.parameters():

but what I actually need is to take each param and change it so that the new value is also reflected inside the model (which would let me see at inference time how the change affects accuracy). Thanks in advance.
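
To make the problem concrete, here is a minimal sketch (the small nn.Sequential model is just a stand-in for illustration): iterating lets me inspect every tensor, but reassigning the loop variable does not change anything inside the model.

import torch
import torch.nn as nn

# Tiny stand-in model; any nn.Module behaves the same way
model = nn.Sequential(nn.Linear(4, 3), nn.ReLU(), nn.Linear(3, 1))

for param in model.parameters():
    print(param.shape)                # inspection works fine
    param = torch.zeros_like(param)   # but this only rebinds the local name;
                                      # the model's own tensor is untouched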

If I understand correctly, you can use

for name, param in model.named_parameters():
    # Just an example: zero out every weight and bias
    if 'weight' in name:
        param.data = torch.zeros_like(param)
    elif 'bias' in name:
        param.data = torch.zeros_like(param)

If you do it this way, the param reflects the changes inside the model without adding a step to the autograd graph.
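
As a sketch of an equivalent way to keep the update out of autograd, you can do the in-place update under torch.no_grad() instead of going through .data (the nn.Linear here is just a stand-in for your actual model):

import torch
import torch.nn as nn

model = nn.Linear(4, 2)   # stand-in for your actual model

with torch.no_grad():
    for name, param in model.named_parameters():
        if 'weight' in name:
            param.copy_(torch.zeros_like(param))   # in-place copy, not tracked by autograd
        elif 'bias' in name:
            param.zero_()                          # in-place zeroing, also not tracked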


OK, thank you very much, this works right away. I had first found this other solution, Overwrite parameters of model with new values, but it’s nice that you can avoid using state_dict().
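
For reference, a rough sketch of the state_dict() route from that other thread (again with a small stand-in model; in practice you would build the new values however you need):

import torch
import torch.nn as nn

model = nn.Linear(4, 2)   # stand-in for your actual model

# Build a dict with the desired value for every entry and load it back in
new_state = {k: torch.zeros_like(v) for k, v in model.state_dict().items()}
model.load_state_dict(new_state)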