How to set hooks on intermediate variables?

Hello all! I have a (hopefully) basic question regarding backward hooks.

I would like to set a hook on an intermediate variable in my model.

I try:

y.register_hook(lambda grad: print(grad.shape))

but that yields the error:

RuntimeError: cannot register a hook on a tensor that doesn't require gradient

So then I attempt to do:

y.requires_grad = True
y.register_hook(lambda grad: print(grad.shape))

but that yields the error:

RuntimeError: you can only change requires_grad flags of leaf variables.

This seems circular? I see a lot of examples where register_hook is used on intermediate variables. What am I doing wrong?

I am using PyTorch 1.6 on GPU.

y.requires_grad_() would work on tensors (though needing a gradient for "external" values is a bit suspicious…).
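For example, assuming y is a freestanding (leaf) tensor created directly rather than computed inside the graph (just a sketch, with made-up shapes):

import torch

y = torch.randn(5, 3)                            # leaf tensor, requires_grad=False
y.requires_grad_()                               # in-place toggle is fine on leaf tensors
y.register_hook(lambda grad: print(grad.shape))  # now allowed

loss = (y ** 2).sum()
loss.backward()                                  # hook prints torch.Size([5, 3])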

I normally do:
if x.requires_grad: x.register_hook(…)
to avoid failures during validation / evaluation.
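Putting the two together, a minimal sketch of hooking an intermediate layer output inside forward (module names and sizes are made up):

import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(10, 8)
        self.fc2 = nn.Linear(8, 2)

    def forward(self, x):
        h = self.fc1(x)                      # intermediate activation
        if h.requires_grad:                  # False under torch.no_grad(), e.g. during validation
            h.register_hook(lambda grad: print("fc1 grad:", grad.shape))
        return self.fc2(h)

model = Net()
out = model(torch.randn(4, 10))
out.sum().backward()                         # prints fc1 grad: torch.Size([4, 8])

The intermediate tensor already requires grad here because it is computed from parameters that do, so there is no need to touch requires_grad at all.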

Apologies, y was used as a generic placeholder for an intermediate layer output (I probably should have used layer or some other variable name).

The extension to this question is applying different lambdas in the hook, to zero out some masked gradient values for instance (a rough sketch of what I mean is below). I will try the method you suggested, thanks!
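For concreteness, a sketch of the masking idea (the mask here is just random, as an example):

import torch
import torch.nn as nn

fc = nn.Linear(10, 8)
x = torch.randn(4, 10)

h = fc(x)                                    # intermediate activation
mask = (torch.rand_like(h) > 0.5).float()    # example mask of 0s and 1s

if h.requires_grad:
    # a hook that returns a tensor replaces the gradient flowing back
    h.register_hook(lambda grad: grad * mask)

h.sum().backward()                           # fc's parameter grads now see the masked gradient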

From what I understand, x.retain_grad() should be preferred over the register_hook method; a rough sketch is below.
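A minimal sketch of the retain_grad() route, which stores the gradient in the tensor's .grad attribute after backward instead of passing it to a callback:

import torch
import torch.nn as nn

fc = nn.Linear(10, 8)
h = fc(torch.randn(4, 10))   # non-leaf intermediate tensor

h.retain_grad()              # ask autograd to keep .grad for this non-leaf tensor
h.sum().backward()

print(h.grad.shape)          # torch.Size([4, 8])

See this for a comparison of various approaches to saving gradients of intermediate variables: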