Modify the feature(activation) maps

Is it possible to modify the activation maps when the model is in evaluation mode? e.g. set the values of certain activation maps to 0 while keeping the others unchanged. I only know that we can indirectly turn off an activation by setting the corresponding weight matrix to zero, but that approach doesn't allow fine-grained modification of the activation map.


Hi @zeke_wang
You should be able to do it with forward pre hooks.

Let’s say you have the following model:

net = nn.Sequential(nn.Linear(16,4),
                    nn.Linear(4, 16))

If you want to modify the activation maps that go into the second Linear, and only in eval mode, your hook will look like this:

def hook_fn(module, input):
    # module.training is False after net.eval()
    if not module.training:
        # input is a tuple of tensors; zero the masked entries in place
        for i in input:
            i[mask] = 0  # mask: a boolean tensor matching i's shape, defined elsewhere

and to register it:

hook = net[1].register_forward_pre_hook(hook_fn)
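Putting the pieces together, here's a minimal runnable sketch. The mask over the four feature channels is a hypothetical choice, and `i[:, mask]` is used so the same channel mask applies to every sample in the batch:

```python
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(16, 4),
                    nn.Linear(4, 16))

# Hypothetical mask: zero the 1st and 3rd of the 4 feature channels
mask = torch.tensor([True, False, True, False])

def hook_fn(module, input):
    # Only modify activations in eval mode (module.training is False)
    if not module.training:
        for i in input:
            i[:, mask] = 0  # zero masked channels for every sample in the batch

hook = net[1].register_forward_pre_hook(hook_fn)

net.eval()
x = torch.randn(2, 16)
with torch.no_grad():
    out = net(x)  # net[1] sees the masked activations

hook.remove()  # detach the hook once it is no longer needed
```

Since the hook modifies the input tensors in place and returns `None`, the mutated tensors are what the module's forward actually receives.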

Thanks for your help. It is so useful to get familiar with hooks in PyTorch; I had only treated them as intermediate-output readers in the past.
I think the complete version would be:

def modify_input(idx, mask):
    def hook(module, input):
        # input is a tuple, so mutate the selected tensor in place
        if not module.training:
            input[idx][mask] = 0  # mask: a boolean tensor matching the tensor's shape
    return hook
hook = net[1].register_forward_pre_hook(modify_input(idx, mask))
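A short usage sketch of the closure version, assuming `idx = 0` (the module's only positional input) and a hypothetical per-channel mask applied across the batch dimension:

```python
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(16, 4),
                    nn.Linear(4, 16))

def modify_input(idx, mask):
    # Returns a pre-hook that zeroes the masked channels of input[idx]
    # whenever the module is in eval mode.
    def hook(module, input):
        if not module.training:
            input[idx][:, mask] = 0
    return hook

# Hypothetical arguments: modify the first (and only) positional input,
# zeroing channels 0 and 3 of the 4-channel activation
mask = torch.tensor([True, False, False, True])
hook = net[1].register_forward_pre_hook(modify_input(0, mask))

net.eval()
x = torch.randn(2, 16)
with torch.no_grad():
    out = net(x)

hook.remove()  # detach the hook when the experiment is done
```

Keeping the handle returned by `register_forward_pre_hook` lets you call `hook.remove()` to restore the unmodified forward pass.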