How to register hook twice?

Hi,

I am trying to visualize the attention maps of my model. There are two pathways that compute attention maps from the visual features (v1 and v2) of two different images and from the features (q) of a question, but the attention mechanism is shared between them. So I have something like this in the forward method of my model:

v1 = self.attention_mechanism(v1, q) 
v2 = self.attention_mechanism(v2, q)

Because the attention mechanism is shared by both pathways, registering the hook on, say, attention_mechanism.conv2 only gives me the attention map for v2 (the second call overwrites the first). But I need the maps for both v1 and v2. What would be the way to do it? I tried several things but none of them worked.

Thank you

You could e.g. append the forward activations to a list inside the hook (or use any other workflow that stores both activations):

import torch
import torch.nn as nn

activation = {}

def get_activation(name):
    # Forward hook that appends each new output to a list
    # instead of overwriting the previous one.
    def hook(model, input, output):
        if name in activation:
            activation[name].append(output.detach())
        else:
            activation[name] = [output.detach()]
    return hook


lin = nn.Linear(1, 1)
lin.register_forward_hook(get_activation('lin'))

# call the layer twice; both outputs are stored in the list
out = lin(torch.randn(1, 1))
out = lin(torch.randn(1, 1))

print(activation)
> {'lin': [tensor([[0.4515]]), tensor([[0.5995]])]}
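
Applied to your model, the same pattern would look roughly like this. This is just a sketch: MyModel, img1, img2, and question are placeholder names for your own model and inputs, and I'm assuming the layer you care about is reachable as model.attention_mechanism.conv2 (adjust the attribute path to your actual module):

model = MyModel()  # hypothetical model containing the shared attention_mechanism
model.attention_mechanism.conv2.register_forward_hook(get_activation('attn_conv2'))

out = model(img1, img2, question)  # placeholder inputs

# conv2 is called once per pathway, so the list holds both maps in call order:
# index 0 corresponds to the v1 pathway, index 1 to the v2 pathway
attn_v1, attn_v2 = activation['attn_conv2']

Note that the lists keep growing with every forward pass, so clear the activation dict between runs if you only want the maps from the latest pass.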