Issue using ._parameters internal method

So, re the first part, when I run:

delattr(model._modules[key], param_key)
setattr(model._modules[key], param_key, updated)

print(model._modules[key]._parameters[param_key])

I get the error:

print(model._modules[key]._parameters[param_key])

KeyError: 'weight'
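If I'm reading the Module internals right, this is expected: delattr removes the entry from the module's _parameters dict, and setattr with a plain tensor goes down the ordinary-attribute path of Module.__setattr__ instead of re-registering a parameter. A minimal standalone reproduction (the layer and tensor here are placeholders, not my actual model):

```python
import torch
import torch.nn as nn

layer = nn.Linear(4, 4)
updated = torch.zeros(4, 4)

delattr(layer, "weight")           # drops "weight" from layer._parameters
setattr(layer, "weight", updated)  # plain tensor -> ordinary attribute only

print("weight" in layer._parameters)       # False -> hence the KeyError
print(torch.equal(layer.weight, updated))  # the attribute itself is set fine
```

So the attribute exists, but _parameters no longer knows about it.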

where param_key is 'weight'. The point is that when I loop over the parameters using

module = model._modules[key]
for param_key in module._parameters:
    # module._parameters[param_key] = memo[p]  # old
    delattr(module, param_key)
    setattr(module, param_key, memo[p])

upon the second iteration I get the error:

for param_key in model._modules[key]._parameters:

RuntimeError: OrderedDict mutated during iteration
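I believe both problems can be worked around together: snapshot the keys with list() before looping so the delattr doesn't mutate the dict mid-iteration, and re-register the new tensor as an nn.Parameter so it lands back in _parameters. A sketch under those assumptions (standalone layer, and the zeros tensor stands in for my real update; note that wrapping in nn.Parameter detaches from any autograd graph, which may not be what you want in a meta-learning setting):

```python
import torch
import torch.nn as nn

layer = nn.Linear(4, 4)

# list() snapshots the keys, so deleting entries inside the loop
# does not mutate the OrderedDict we are iterating over.
for param_key in list(layer._parameters.keys()):
    updated = torch.zeros_like(layer._parameters[param_key])
    delattr(layer, param_key)  # removes the entry from _parameters
    # Re-register as a Parameter; a plain tensor would only become
    # an ordinary attribute, which is what caused the KeyError.
    setattr(layer, param_key, nn.Parameter(updated))

print("weight" in layer._parameters)  # True
```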

Re the second part, I tried using _stateless with the snippet you provided here, and I’m getting the following error:

for name, tensor in parameters_and_buffers.items():

AttributeError: 'generator' object has no attribute 'items'

when I

print(params)

I get

<generator object Module.named_parameters at 0x7f98f8dc1450>
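which I take to mean the functional call wants a mapping with .items(), while named_parameters() only yields a generator. Materializing it into a dict seems to fix it for me in a toy model (I'm using torch.func.functional_call here, which is where this lives in torch >= 2.0; on older versions it was under torch.nn.utils._stateless / torch.nn.utils.stateless):

```python
import torch
import torch.nn as nn
from torch.func import functional_call  # torch >= 2.0; older: torch.nn.utils._stateless

model = nn.Linear(3, 2)

# named_parameters() is a generator; functional_call expects a dict-like
# object with .items(), so convert it first.
params = dict(model.named_parameters())

x = torch.randn(1, 3)
out = functional_call(model, params, (x,))
print(out.shape)  # torch.Size([1, 2])
```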

My model parameters include:

self.fc = nn.Linear(10, 10)
self.relu = nn.ReLU()
self.params_list = nn.ParameterList([self.fc.weight, self.fc.bias])
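One thing I noticed that may matter for the loops above: params_list holds the same Parameter objects as fc, and named_parameters() deduplicates shared parameters by default (remove_duplicate=True), so each shared parameter should show up only once in the dict I pass to the functional call. A small check, assuming a minimal wrapper module around the layers above:

```python
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(10, 10)
        self.relu = nn.ReLU()
        # shares the exact same Parameter objects as self.fc
        self.params_list = nn.ParameterList([self.fc.weight, self.fc.bias])

net = Net()
names = [n for n, _ in net.named_parameters()]
print(names)  # the params_list duplicates are filtered out by default
```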