Unable to access modules whose name contains "."

Hi,
I would like to check the stats of one of the modules in my model. However, I get a SyntaxError when its name contains ".".
Below is the case:

for name, param in model.named_buffers():
    print(name)

The result gives something like this (I only show part of it):

cls_head.1.running_mean
cls_head.1.running_var
cls_head.1.num_batches_tracked

When I try to check the stats of the module:

print(model.cls_head.1.running_mean)

It raises a SyntaxError. How can I fix it? Thanks!

Hi,

This is just a unique name that we generate to identify the buffers, but it should not be used to index into the module this way. In particular, the 1 most likely comes from a Sequential (or a similar construct like a ModuleList or ModuleDict). You will need to check these and use the appropriate indexing method (like cls_head[1]) to access the module.
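
For example, a minimal sketch (assuming cls_head is an nn.Sequential whose entry 1 is a batch norm layer, as the buffer names suggest):

# Index the Sequential with brackets instead of dotted attribute access.
print(model.cls_head[1].running_mean)

# Recent PyTorch versions also provide a lookup by the dotted name itself:
print(model.get_buffer("cls_head.1.running_mean"))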


I think I forgot to mention that I just want to check whether the weights vary during training. In the case below I am able to print out the weight. How can I do it? Thanks!

import torch.nn as nn
import torch.nn.functional as F
from torch.nn import Conv2d, BatchNorm2d, MaxPool2d, init

class MODEL(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = Conv2d(3, 32, kernel_size=3, padding=1)
        self.bn1 = BatchNorm2d(32)
        self.pooling1 = MaxPool2d(2, 2)
        self.conv2 = Conv2d(32, 64, kernel_size=3, padding=1)
        self.bn2 = BatchNorm2d(64)
        self.pooling2 = MaxPool2d(2, 2)
        self.extra_conv1 = Conv2d(32, 32, kernel_size=3, padding=1)
        # Initialize all conv weights to ones so changes are easy to spot.
        for module in self.modules():
            if isinstance(module, nn.Conv2d):
                init.ones_(module.weight)

    def feature_extractor(self, x):
        out = self.extra_conv1(x)
        return out

    def forward(self, x):
        x = self.conv1(x)
        x = self.bn1(x)
        x = F.relu(x)
        x = self.feature_extractor(x)
        x = self.pooling1(x)
        x = self.conv2(x)
        x = self.bn2(x)
        x = F.relu(x)
        x = self.pooling2(x)
        return x
        
class MODEL2(nn.Module):
    def __init__(self, base):
        super().__init__()
        self.base = base
        self.conv1 = Conv2d(64, 32, kernel_size=3, padding=1)
        self.bn1 = BatchNorm2d(32)
        self.pooling1 = MaxPool2d(2, 2)
        self.conv2 = Conv2d(32, 16, kernel_size=3, padding=1)
        self.bn2 = BatchNorm2d(16)
        self.pooling2 = MaxPool2d(2, 2)

    def forward(self, x):
        x = self.base(x)
        x = self.conv1(x)
        x = self.bn1(x)
        x = F.relu(x)
        return x

if __name__ == "__main__":
    base = MODEL()
    model = MODEL2(base)

    for name, param in model.named_parameters():
        print(name)
    print(model.base.extra_conv1.weight)

You can simply use print(param) inside the loop. That is the weight.
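
A quick sketch: the dotted names from named_parameters() are perfectly fine as string keys, so you can also grab a single weight that way (using the base.extra_conv1.weight name from your code above):

params = dict(model.named_parameters())
# Same tensor object as model.base.extra_conv1.weight:
print(params["base.extra_conv1.weight"])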

Sorry for the late reply. What if I only want to trace one layer's weight during training? Actually, I have already worked it out based on your first helpful suggestion.
But the reason why I failed is still not clear to me. As you can see from my last reply, I am able to access the weight from model.named_parameters(), so my first guess is that the failure was caused by the name containing "."? Thanks!!

What if I only want to trace one layer's weight during training?

The optimizer changes the parameters in place. So if you just hold on to param, it will be updated as training goes 🙂
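
A minimal sketch of that behaviour (the training step here is hypothetical, just to illustrate the in-place update):

import torch

w = model.base.extra_conv1.weight  # hold on to the parameter
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

out = model(torch.randn(1, 3, 32, 32))  # dummy forward pass
out.sum().backward()
optimizer.step()  # updates all parameters in place

print(w)  # same object as before, values already changed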

Your original code fails because attribute names cannot start with a digit. So foo.1 is invalid in Python and raises a SyntaxError.
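
For reference, a tiny sketch (again assuming cls_head is a Sequential): the submodule is still reachable by its string name, even though the dotted attribute syntax is not valid:

bn = model.cls_head[1]             # Sequential supports [] indexing
bn = getattr(model.cls_head, "1")  # attribute lookup by string name also works
print(bn.running_mean)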

Okay, I was trying to check whether I had frozen the batch norm layers correctly. Thanks for your explanations and patience 🙂
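
For anyone checking the same thing, here is a common recipe for freezing batch norm layers, as a sketch:

import torch

for m in model.modules():
    if isinstance(m, nn.BatchNorm2d):
        m.eval()  # stop updating running_mean / running_var
        for p in m.parameters():
            p.requires_grad_(False)  # stop updating weight / bias

# Verify: the running stats should now stay constant across training steps.
before = model.base.bn1.running_mean.clone()
# ... run a training step here ...
assert torch.equal(before, model.base.bn1.running_mean)

Note that a later call to model.train() puts the batch norm layers back into training mode, so eval() has to be re-applied after it.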
