These are just unique names generated to identify the buffers; you should not use them to index into the module directly. In particular, the 1 most likely comes from an nn.Sequential (or a similar container) that wraps a ModuleList or ModuleDict. You will need to inspect these containers and use the appropriate indexing (e.g. cls_head[1]) to access the submodule.
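For example, assuming a head wrapped in an nn.Sequential (the `cls_head` name here is just illustrative, matching the names above), you would index into the container rather than treating the dotted name as an attribute:

```python
import torch
import torch.nn as nn

# Hypothetical head: a Sequential wrapping a Linear and a BatchNorm1d.
# The generated parameter/buffer names will contain indices like "1.weight",
# but those are names only -- to get the module itself, index the container.
cls_head = nn.Sequential(
    nn.Linear(128, 64),
    nn.BatchNorm1d(64),
)

bn = cls_head[1]              # index the Sequential, not getattr(cls_head, "1")
print(type(bn).__name__)      # BatchNorm1d
print(bn.running_mean.shape)  # the buffer that showed up as "1.running_mean"
```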
I think I forgot to mention that I just want to check whether the weight varies during training. In the case below I am able to print out the weight. How can I do it? Thanks!
import torch.nn as nn
import torch.nn.functional as F
from torch.nn import Conv2d, BatchNorm2d, MaxPool2d, init

class MODEL(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = Conv2d(3, 32, kernel_size=3, padding=1)
        self.bn1 = BatchNorm2d(32)
        self.pooling1 = MaxPool2d(2, 2)
        self.conv2 = Conv2d(32, 64, kernel_size=3, padding=1)
        self.bn2 = BatchNorm2d(64)
        self.pooling2 = MaxPool2d(2, 2)
        self.extra_conv1 = Conv2d(32, 32, kernel_size=3, padding=1)
        # Initialize all conv weights to ones so changes are easy to spot
        for module in self.modules():
            if isinstance(module, nn.Conv2d):
                init.ones_(module.weight)

    def feature_extractor(self, x):
        out = self.extra_conv1(x)
        return out

    def forward(self, x):
        x = self.conv1(x)
        x = self.bn1(x)
        x = F.relu(x)
        x = self.feature_extractor(x)
        x = self.pooling1(x)
        x = self.conv2(x)
        x = self.bn2(x)
        x = F.relu(x)
        x = self.pooling2(x)
        return x

class MODEL2(nn.Module):
    def __init__(self, base):
        super().__init__()
        self.base = base
        self.conv1 = Conv2d(64, 32, kernel_size=3, padding=1)
        self.bn1 = BatchNorm2d(32)
        self.pooling1 = MaxPool2d(2, 2)
        self.conv2 = Conv2d(32, 16, kernel_size=3, padding=1)
        self.bn2 = BatchNorm2d(16)
        self.pooling2 = MaxPool2d(2, 2)

    def forward(self, x):
        x = self.base(x)
        x = self.conv1(x)
        x = self.bn1(x)
        x = F.relu(x)
        return x

if __name__ == "__main__":
    base = MODEL()
    model = MODEL2(base)
    for name, param in model.named_parameters():
        print(name)
    print(model.base.extra_conv1.weight)
Sorry for the late reply. What if I only want to trace a single layer's weight during training? Actually, I have already worked it out based on your first helpful suggestion.
But the reason it failed is still not clear to me. As you can see from my last reply, I am able to access the weight via model.named_parameters(), so my first guess is that the failure was caused by the name containing "."? Thanks!!
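For reference, one common way to trace a single layer's weight during training is to snapshot it with .detach().clone() before an optimizer step and compare afterwards. This is a minimal sketch with a toy model and random data, not the exact model from the thread:

```python
import torch
import torch.nn as nn

# Toy model: we only want to trace the first Linear layer's weight.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
watched = model[0].weight  # the single parameter we want to watch

x = torch.randn(16, 4)
y = torch.randn(16, 1)

for step in range(3):
    before = watched.detach().clone()  # snapshot BEFORE the update
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    optimizer.step()
    # A nonzero delta means the weight is actually being trained.
    delta = (watched.detach() - before).abs().max().item()
    print(f"step {step}: max weight change = {delta:.6f}")
```

The clone is important: without it, `before` would be a view of the same storage and the comparison would always report zero change.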