Hi,
How do I remove a hook from a layer? I am doing the following, but it seems incorrect.
First, I access the features of denselayer1 with the hook1 function:
import torch
import torchvision

densenet121 = torchvision.models.densenet121()
random_input = torch.randn(1, 3, 224, 224)

denselayer1 = []
def hook1(module, inputs, output):
    denselayer1.append(output)

handle1 = densenet121.features.denseblock1.denselayer1.register_forward_hook(hook1)
_ = densenet121(random_input)
q = denselayer1[0] + torch.tensor(3)
handle1.remove()
This is where I remove the hook, with handle1.remove() above.
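For reference, this is how I understand hook handles to work, shown as a minimal sketch on a toy nn.Linear (not densenet121 itself): register_forward_hook returns a handle, and once handle.remove() is called the hook stops firing on later forward passes.

```python
import torch
import torch.nn as nn

captured = []

def hook(module, inputs, output):
    # Runs on every forward pass while the hook is attached.
    captured.append(output)

layer = nn.Linear(4, 2)
handle = layer.register_forward_hook(hook)

layer(torch.randn(1, 4))   # hook fires, one output captured
handle.remove()            # detach the hook
layer(torch.randn(1, 4))   # hook no longer fires

print(len(captured))       # 1
```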
Then I access the features of denselayer2 with the hook2 function:
denselayer2 = []
def hook2(module, inputs, output):
    denselayer2.append(output)

handle2 = densenet121.features.denseblock1.denselayer2.register_forward_hook(hook2)
_ = densenet121(random_input)
Now I also register hook1 on denselayer2:

densenet121.features.denseblock1.denselayer2.register_forward_hook(hook1)
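My understanding (again as a minimal sketch on a toy nn.Linear, not the DenseNet code above) is that register_forward_hook only attaches the hook; nothing is captured until the module's next forward pass:

```python
import torch
import torch.nn as nn

outputs = []

def hook(module, inputs, output):
    outputs.append(output)

layer = nn.Linear(4, 2)
layer.register_forward_hook(hook)

# Registration alone does not run the hook.
print(len(outputs))        # 0

layer(torch.randn(1, 4))   # hook fires here
print(len(outputs))        # 1
```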
Now, when I compare

denselayer1[0] == denselayer2[0]

it comes out False. What am I doing wrong?