How to automatically disable register_hook when the model is in the eval() phase in PyTorch?

I need to update the gradients of an intermediate tensor variable using the register_hook method. Since the variable isn't a leaf variable, I need to call the retain_grad() method on it first, after which I can use register_hook to alter the gradients.

score.retain_grad()  # keep .grad populated on the non-leaf tensor
h = score.register_hook(lambda grad: grad * torch.FloatTensor(...))  # rescale the incoming gradient

This works perfectly fine during the training phase (model.train()). However, it raises an error during the evaluation phase (model.eval()).

The error:

File "/home/envs/darthvader/lib/python3.6/site-packages/torch/tensor.py", line 198, in register_hook
    raise RuntimeError("cannot register a hook on a tensor that "
RuntimeError: cannot register a hook on a tensor that doesn't require gradient

How can the model automatically disable the register_hook call when it is in the eval() phase?

Hi,

I think the simplest thing to do here is to guard the register_hook() call with if score.requires_grad:.
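For illustration, a minimal sketch of that guard (the * 2.0 scaling here is just a hypothetical stand-in for your actual gradient transform):

h = None
if score.requires_grad:  # True only while autograd is tracking score
    h = score.register_hook(lambda grad: grad * 2.0)  # placeholder transform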

Also, a side note: you don't need to call retain_grad() for the hook to work; it is only needed if you want to be able to read the value from the .grad field of the Tensor.
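A small self-contained example of that distinction (tensor names and values here are made up):

import torch

w = torch.randn(2, requires_grad=True)
score = w * 3.0                       # intermediate (non-leaf) tensor
score.register_hook(lambda g: g * 2)  # the hook fires without retain_grad()
score.sum().backward()
print(w.grad)  # the hook doubled the incoming gradient: 2 * 3 = 6 per element
# score.grad would still be None here; it is only populated after score.retain_grad()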


Hi @albanD,

Thanks for the answer ^^. You are correct: register_hook works without retain_grad(). The code now works perfectly fine during the eval() phase as well.

I was confused about using if score.requires_grad: on the score variable, since it is not a leaf-node variable but an intermediate-node variable.

An intermediate Tensor will have requires_grad=True in Python if it is used in the computation of something that needs gradients computed, so you can use that as a hint.
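A quick sketch of that propagation, assuming a typical eval-time torch.no_grad() context:

import torch

w = torch.randn(3, requires_grad=True)  # leaf that needs gradients
x = torch.randn(3)                      # leaf that does not

score = w * x
print(score.requires_grad)  # True: score depends on w, which needs gradients

with torch.no_grad():       # common during model.eval() inference
    score = w * x
print(score.requires_grad)  # False: registering a hook here raises the error above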

Yes, exactly. Somehow, it completely slipped my mind even when I checked the grad_fn attribute of the score variable in question.
