Is that because of the parameter references held by my optimizer?
If that's the case, what do you suggest so that I can change a layer and still be able to train the model?
If you swap out the weights of a trained model, you'd need to update the parameter list within the optimizer.
I'm not 100% sure about using `detach` here; you might be better off copying the values across in place, e.g. `model.x_layer.weight.copy_(new_weights)` inside a `torch.no_grad()` block, so the `Parameter` objects (and the optimizer's references to them) stay intact.
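A minimal sketch of that in-place approach, assuming a simple `nn.Sequential` model and SGD (the layer names and shapes here are placeholders, not from the original post). Because `copy_` overwrites the tensor's values without replacing the `Parameter` object, the optimizer's parameter list remains valid and training continues normally:

```python
import torch
import torch.nn as nn

# Toy model and optimizer; shapes are illustrative only.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Pretend these are weights coming from elsewhere (e.g. an aggregated model).
new_weight = torch.randn(8, 4)

# Copy in place under no_grad: the Parameter object itself is unchanged,
# so the optimizer (and any state buffers keyed on it) still points at it.
with torch.no_grad():
    model[0].weight.copy_(new_weight)

# Training still works with the same optimizer afterwards.
x, y = torch.randn(16, 4), torch.randn(16, 2)
loss = nn.functional.mse_loss(model(x), y)
loss.backward()
optimizer.step()
```

If you instead *reassign* the attribute (`model[0].weight = nn.Parameter(...)`), the optimizer would keep updating the old, now-orphaned tensor, which is exactly when you'd have to rebuild the optimizer's parameter list.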
My model keeps updating without any problem; what I don't understand is whether I actually need to update my optimizer or not.
This is a federated learning scenario: I train each client's model for a few epochs, then aggregate the models and feed the aggregated model back to the clients.
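For that round-trip, one common pattern (a hedged sketch, not necessarily your setup; the model, client count, and unweighted FedAvg-style averaging are all assumptions) is to average the clients' `state_dict`s and push the result back with `load_state_dict`. Since `load_state_dict` copies values into the existing parameters in place, each client's optimizer keeps valid references and does not need to be rebuilt:

```python
import copy
import torch
import torch.nn as nn

def make_model():
    # Placeholder client model.
    return nn.Linear(4, 2)

clients = [make_model() for _ in range(3)]
optimizers = [torch.optim.SGD(m.parameters(), lr=0.1) for m in clients]

# ... a few local training epochs per client would happen here ...

# Aggregate: simple unweighted average of every parameter/buffer.
avg_state = copy.deepcopy(clients[0].state_dict())
for key in avg_state:
    avg_state[key] = torch.stack(
        [c.state_dict()[key] for c in clients]
    ).mean(dim=0)

# Feed the aggregated model back to every client, in place.
for c in clients:
    c.load_state_dict(avg_state)
```

One caveat: if your optimizer carries per-parameter state (e.g. Adam's moment estimates), that state was accumulated against the pre-aggregation weights; whether to keep or reset it after each round is a design choice, not something the in-place copy decides for you.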