Why does my model stop updating when I modify one of its layers?

I was curious why my model stops updating when I change one of its layers with the code below:

model1.x_layer.weight = nn.Parameter(model2.x_layer.weight.detach().clone())

Is that because of the parameter reference in my optimizer?
If that's the case, what do you suggest doing so that I can change a layer and still train the model?

If you swap out the weights of a trained model, you'd need to update the parameter list within the optimizer.
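To illustrate the point above, here is a minimal sketch (using a hypothetical `nn.Linear` layer in place of `x_layer`) of why reassigning a `Parameter` breaks the optimizer link, and one way to fix it by rebuilding the optimizer:

```python
import torch
import torch.nn as nn

model1 = nn.Linear(4, 2)
model2 = nn.Linear(4, 2)
optimizer = torch.optim.SGD(model1.parameters(), lr=0.1)

# Reassigning creates a brand-new Parameter object; the optimizer still
# holds a reference to the OLD tensor, so steps no longer touch model1.weight.
model1.weight = nn.Parameter(model2.weight.detach().clone())
stale = optimizer.param_groups[0]["params"][0]
assert model1.weight is not stale  # optimizer tracks the stale tensor

# Fix: recreate the optimizer (or update its param list) after the swap.
optimizer = torch.optim.SGD(model1.parameters(), lr=0.1)
assert model1.weight is optimizer.param_groups[0]["params"][0]
```

Note that recreating the optimizer also discards any internal state (e.g. momentum buffers), which may or may not be what you want.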

I'm not 100% sure about the use of the detach function; perhaps you'd be better off just copying it directly across with model.x_layer.weight = model.x_layer.weight.copy()?

Hi, thanks for replying!
Well, when I modify my model1 weights with the code below:

model1.x_layer.weight.data = model2.x_layer.weight.detach().clone()

my model keeps updating without any problems. I don't understand exactly whether I should update my optimizer or whether that's not needed.
This is a federated learning scenario in which I train my models for a few epochs, then aggregate them and feed the aggregated model back to them.
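The reason the `.data` version keeps training is that it copies values into the existing `Parameter`'s storage, so the object the optimizer references never changes. A sketch of that in-place approach (again with a hypothetical `nn.Linear` standing in for the real layers), including the common idiom for the aggregation step:

```python
import torch
import torch.nn as nn

model1 = nn.Linear(4, 2)
model2 = nn.Linear(4, 2)
optimizer = torch.optim.SGD(model1.parameters(), lr=0.1)

with torch.no_grad():
    # Same effect as `weight.data = ...`, but the usual recommended idiom:
    model1.weight.copy_(model2.weight)

# The Parameter object itself is unchanged, so the optimizer still tracks it
# and no optimizer rebuild is needed.
assert model1.weight is optimizer.param_groups[0]["params"][0]
assert torch.equal(model1.weight, model2.weight)

# For the federated aggregation step, the same idea applied to whole
# state dicts: average the clients' weights and load them back in-place.
avg_state = {
    k: (model1.state_dict()[k] + model2.state_dict()[k]) / 2
    for k in model1.state_dict()
}
model1.load_state_dict(avg_state)  # load_state_dict also copies in-place
```

So in your scenario, updating the parameters in-place (via `copy_` or `load_state_dict`) lets you keep the same optimizer across aggregation rounds.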