How to access parameters using the model's attribute names

I am using a for loop to modify the parameters in the model. I use named_parameters() to check the names of the attributes, and a for loop to record them:

weight_data = []
weight_key = []
bias_key = []

for key, value in net.named_parameters():
    if torch.tensor(value.size()).size() == torch.Size([3]):
        …
    elif …:
        …

What I want to do is modify the parameters recorded in weight_data. I know I can assign to the .data attribute directly, but I need to do it in a for loop.
So how can I use the keys in weight_key to do that? For example:
net.'weight_key[0]'.data = […] (which is not right; I just need to use the value stored in weight_key)

I would appreciate your help!

You could use getattr to get the submodules using their names.
Here is a small example:

import torch
import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        super(MyModel, self).__init__()
        self.fc1 = nn.Linear(1, 1)

    def forward(self, x):
        x = self.fc1(x)
        return x

model = MyModel()

keys = []
for name, value in model.named_parameters():
    keys.append(name)  # e.g. 'fc1.weight', 'fc1.bias'

with torch.no_grad():
    getattr(model, keys[0].split('.')[0]).weight.fill_(0.)  # split key to only get 'fc1'