Why is the id of the `module` argument passed to the hook (`hook(module, input, output)`) not equal to the id of the module on which I called `register_forward_hook`?

My code is:
```python
def func_f(module, input, output):
    print('#####id : {}, module:{}, '.format(id(module), module))
    self.all_fmaps[id(module)] = output.data.cpu()

def func_b(module, grad_in, grad_out):
    self.all_grads[id(module)] = grad_out[0].cpu()

for module in self.model.named_modules():
    # print("{} type is {}".format(module[0], type(module[1])))
    if module[0] == 'module.base_model.layer4.2':
        print('In map id: {}'.format(id(module)))
        print('In map : {}'.format(module))
        self.m = module[1]
        module[1].register_forward_hook(func_f)
        module[1].register_backward_hook(func_b)
```

It prints:
```
In map id: 140121648773960
In map : ('module.base_model.layer4.2', Bottleneck(
(conv1): Conv2d (2048, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
(bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True)
(conv_normal): Conv2d (512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn_normal): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True)
(conv4): Conv2d (512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False)
(bn4): BatchNorm2d(2048, eps=1e-05, momentum=0.1, affine=True)
(relu): ReLU(inplace)
))

#####id : 140121605366560, module:Bottleneck(
(conv1): Conv2d (2048, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
(bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True)
(conv_normal): Conv2d (512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn_normal): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True)
(conv4): Conv2d (512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False)
(bn4): BatchNorm2d(2048, eps=1e-05, momentum=0.1, affine=True)
(relu): ReLU(inplace)
),
```
So the two printouts show what looks like the same module, but with different ids. I don't understand why; does anyone know?
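For reference, here is a minimal standalone version of the same comparison, using a toy `nn.Sequential` model instead of my actual network (the model, variable names, and hook are placeholders I made up for the repro):

```python
import torch
import torch.nn as nn

# Toy model standing in for the real network in the post above.
model = nn.Sequential(nn.Linear(4, 4))

seen = {}

def func_f(module, input, output):
    # Record the id of the object the forward hook actually receives.
    seen['hook_id'] = id(module)

for module in model.named_modules():
    # NOTE: `named_modules()` yields (name, module) tuples, so the loop
    # variable `module` here is a tuple, not the nn.Module itself.
    if module[0] == '0':
        reg_tuple_id = id(module)      # id of the (name, module) tuple
        reg_module_id = id(module[1])  # id of the actual nn.Module
        module[1].register_forward_hook(func_f)

model(torch.randn(2, 4))
print(reg_tuple_id == seen['hook_id'])   # prints False
print(reg_module_id == seen['hook_id'])  # prints True
```

In this repro, the id printed at registration time (`id(module)`) and the id printed inside the hook disagree in exactly the same way as in my output above, while `id(module[1])` does match the hook's id.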
