Should a backward hook be registered before the forward pass?

I want to do the forward pass first, then repeat
register a backward hook => run backward propagation => clear gradients and remove the hook => register another backward hook
so that I can use several backward hook functions that must not be registered simultaneously.
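Here is a toy sketch of what I mean (the model and `hook_a` / `hook_b` are placeholders for my real setup):

```python
import torch
import torch.nn as nn

# toy setup; hook_a / hook_b stand in for my real hook functions
net = nn.Sequential(nn.Linear(4, 4), nn.ReLU(), nn.Linear(4, 1))
x = torch.randn(8, 4)

def hook_a(module, grad_input, grad_output):
    print("hook_a:", grad_output[0].norm())

def hook_b(module, grad_input, grad_output):
    print("hook_b:", grad_output[0].norm())

loss = net(x).sum()  # forward pass, done once

for hook_fn in (hook_a, hook_b):
    handle = net.register_backward_hook(hook_fn)  # registered *after* forward
    loss.backward(retain_graph=True)  # the hook never fires: it was added after forward
    net.zero_grad()                   # clear gradients
    handle.remove()                   # remove before registering the next hook
```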

However, it seems the backward hook is actually wired up when the forward function is called, not during the backward pass (https://github.com/pytorch/pytorch/blob/master/torch/nn/modules/module.py#L484), so a backward hook registered after the forward pass doesn't work.

Is there any possible trick to make this work?

The module backward hook should be registered after instantiating the module, but likely only once. Note that module backward hooks are not terribly useful currently, and a recent attempt to fix them was abandoned.
Tensor backward hooks, on the other hand, need to be added after every forward pass, but can be added at any time before the backward pass.
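Something along these lines (a minimal sketch with a toy linear layer; the hook just prints the gradient norm of the output):

```python
import torch
import torch.nn as nn

net = nn.Linear(4, 1)  # toy module
x = torch.randn(8, 4)

for step in range(2):
    out = net(x)  # forward pass
    # tensor hook: added after the forward, any time before the backward
    h = out.register_hook(lambda grad: print("grad norm:", grad.norm()))
    out.sum().backward()  # the hook fires here
    net.zero_grad()
    h.remove()            # drop it before the next iteration
```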

Best regards

Thomas

Thanks for the answer!
Umm… could you explain why module backward hooks are called broken? The hook currently gives me the gradient information just fine, which is its original purpose…

The module backward hook as it is today is essentially the same as a backward hook on the output variable. If you have a non-trivial module (i.e. one which performs more than one operation), you won't get the input gradients of the module, but those of the last operation in the module.
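For example (a sketch; the exact contents of `grad_input` can vary between versions): a module whose forward performs a multiplication followed by an addition. The hook sees the gradient at the final addition (all ones), not the module's true input gradient (2 everywhere):

```python
import torch
import torch.nn as nn

class TwoOps(nn.Module):
    # a non-trivial module: forward performs two operations
    def forward(self, x):
        y = x * 2     # first operation
        return y + 1  # second (and last) operation

def hook(module, grad_input, grad_output):
    # grad_input is the gradient at the *last* operation (the addition),
    # i.e. all ones, not the gradient w.r.t. the module's input
    print("grad_input:", grad_input)

m = TwoOps()
m.register_backward_hook(hook)

x = torch.randn(3, requires_grad=True)
m(x).sum().backward()
print("x.grad:", x.grad)  # the true input gradient is 2 everywhere
```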

Best regards

Thomas

Thanks for the detailed explanation!!