How is it possible that I can't create a deep copy although all my parameters are leaves?

I was playing around with deep copy and eventually hit a very puzzling error where all my parameters are indeed leaves, but I still cannot make a deep copy:

```python
import torch
import torch.nn as nn

import copy
from collections import OrderedDict

model = nn.Sequential( OrderedDict( [ ('fc0', nn.Linear(3,1)) ] ) )
w = model.fc0.weight
print(f'w = {w.norm(2)}')
for i in range(5):
    w_new = w - 1000
print(f'w_new = {w_new.norm(2)}')
print(f'w_new is not a leaf?: {not(w_new.is_leaf)}')
assert not(w_new.is_leaf)
#model.fc0.weight = nn.Parameter( w_new )
setattr(model,'fc0.weight', w_new )
print(f'model.fc0.weight.norm(2) = {model.fc0.weight.norm(2)}')
print(model.fc0.weight.is_leaf)
print(model.fc0.bias.is_leaf)
model_copy = copy.deepcopy(model)
```

output:

```

w = 0.4955401122570038
w_new = 1732.4461669921875
w_new is not a leaf?: True
model.fc0.weight.norm(2) = 0.4955401122570038
True
True
.
.
.
__deepcopy__(self, memo)
     42     def __deepcopy__(self, memo):
     43         if not self.is_leaf:
---> 44             raise RuntimeError("Only Tensors created explicitly by the user "
     45                                "(graph leaves) support the deepcopy protocol at the moment")
     46         if id(self) in memo:

RuntimeError: Only Tensors created explicitly by the user (graph leaves) support the deepcopy protocol at the moment
```

which seems really strange. What is going on? Is this a bug in PyTorch?

You are not using setattr correctly. `setattr(model, 'fc0.weight', w_new)` does not reach into fc0: it just attaches a plain attribute literally named `'fc0.weight'` to the top-level module (that is why `model.fc0.weight` still prints its original norm), and deepcopy then fails when it walks over that stray non-leaf tensor. If you want to change the weight of fc0, you need to do `setattr(model.fc0, 'weight', nn.Parameter(w_new))`, wrapping it in nn.Parameter as in your commented-out line so the module accepts it and it is a leaf again.
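For example, here is a minimal self-contained sketch of that fix (it just rebuilds the same toy model from your snippet; the nn.Parameter wrap mirrors your commented-out line):

```python
import copy
from collections import OrderedDict

import torch
import torch.nn as nn

model = nn.Sequential(OrderedDict([('fc0', nn.Linear(3, 1))]))
w_new = model.fc0.weight - 1000          # non-leaf tensor (has a grad_fn)

# Set the attribute on the submodule, not on the top-level module.
# Wrapping in nn.Parameter makes it a leaf parameter again, so deepcopy works.
setattr(model.fc0, 'weight', nn.Parameter(w_new))

model_copy = copy.deepcopy(model)        # no RuntimeError
print(model.fc0.weight.is_leaf)          # True
```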

What I want to avoid is exactly this `model.fc0`, because I want it to work for any model in general (I only have the string names of the layers)… is that possible?

Then you need to resolve the dotted name in two steps: a getattr to fetch fc0 by name, and then a setattr for weight on that submodule.
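A minimal sketch of what that could look like, assuming the parameter is addressed by a dotted name such as `'fc0.weight'` (the helper name `set_param_by_name` is just illustrative):

```python
import copy
from collections import OrderedDict

import torch
import torch.nn as nn

def set_param_by_name(model, name, tensor):
    """Assign `tensor` as the parameter `name` (e.g. 'fc0.weight', possibly nested)."""
    *path, attr = name.split('.')
    parent = model
    for part in path:                     # getattr down to the module that owns the parameter
        parent = getattr(parent, part)
    # wrap in nn.Parameter so the module accepts it and it is a leaf again
    setattr(parent, attr, nn.Parameter(tensor))

model = nn.Sequential(OrderedDict([('fc0', nn.Linear(3, 1))]))
w_new = model.fc0.weight - 1000           # non-leaf tensor
set_param_by_name(model, 'fc0.weight', w_new)

model_copy = copy.deepcopy(model)         # works: every parameter is a leaf again
```

On newer PyTorch versions you could also use `model.get_submodule('fc0')` for the lookup step instead of the manual getattr walk, if it is available in your version.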