Use forward_pre_hook to modify nn.Module parameters

Before every call to an nn.Module’s forward() function, I want to check and possibly modify that module’s parameters, including weight and bias.

But the official source code of “register_forward_pre_hook” below doesn’t really say whether this is achievable.

    def register_forward_pre_hook(self, hook):
        r"""Registers a forward pre-hook on the module.

        The hook will be called every time before :func:`forward` is invoked.
        It should have the following signature::

            hook(module, input) -> None or modified input

        The hook can modify the input. User can either return a tuple or a
        single modified value in the hook. We will wrap the value into a tuple
        if a single value is returned (unless that value is already a tuple).

        Returns:
            :class:`torch.utils.hooks.RemovableHandle`:
                a handle that can be used to remove the added hook by calling
                ``handle.remove()``
        """
        handle = hooks.RemovableHandle(self._forward_pre_hooks)
        self._forward_pre_hooks[handle.id] = hook
        return handle
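
For context, here’s a minimal sketch of how the documented signature is used; the toy module and the hook name (scale_input) are my own, not from the docs:

    import torch
    import torch.nn as nn

    lin = nn.Linear(4, 2)

    # A pre-hook receives (module, input), where `input` is the tuple of
    # positional arguments passed to forward(). Returning a value replaces it.
    def scale_input(module, input):
        return (input[0] * 2,)  # the returned tuple becomes forward()'s input

    handle = lin.register_forward_pre_hook(scale_input)
    out = lin(torch.randn(3, 4))  # forward() now sees the scaled input
    handle.remove()  # remove the hook when no longer needed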

So the question is simple: can a forward pre-hook modify the module?

P.S. I’m not sure if this is the same question as the one raised here: Add hookable weights · Issue #5790 · pytorch/pytorch · GitHub

Hi,

Yes, you can modify the Module in any way you want during that hook (the module is the first arg of the hook function).
Note, though, that if you swap out weight Tensors, things might not behave properly with other components that track the old Tensors, such as optimizers.
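
For illustration, here’s a minimal sketch of such a hook that edits the parameters in place, which sidesteps the optimizer issue because the Parameter objects themselves are never replaced; the clamping rule is just a made-up example:

    import torch
    import torch.nn as nn

    def clamp_params(module, input):
        # In-place edits keep the same Parameter objects alive, so an
        # optimizer already holding references to them keeps working.
        with torch.no_grad():
            module.weight.clamp_(-1.0, 1.0)
            if module.bias is not None:
                module.bias.clamp_(-1.0, 1.0)

    lin = nn.Linear(4, 2)
    lin.register_forward_pre_hook(clamp_params)
    out = lin(torch.randn(3, 4))  # parameters are clamped before forward() runs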