I need to update the gradients of an intermediate tensor using the register_hook method. Since the tensor isn't a leaf variable, I first call retain_grad() on it, after which I can use register_hook to alter the grads.
score.retain_grad()
h = score.register_hook(lambda grad: grad * torch.FloatTensor(...))
This works perfectly fine during the training phase ( model.train() ). However, it raises an error during the evaluation phase ( model.eval() ).
The error:
File "/home/envs/darthvader/lib/python3.6/site-packages/torch/tensor.py", line 198, in register_hook
raise RuntimeError("cannot register a hook on a tensor that "
RuntimeError: cannot register a hook on a tensor that doesn't require gradient
How can I make the model automatically skip the register_hook call when it is in the eval() phase?
I think the simplest thing to do here is to guard the register_hook() call with if score.requires_grad:.
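A minimal sketch of that guard, using a toy computation in place of the real model (the tensor names and the scaling factor here are placeholders, not the original code):

```python
import torch

# Hypothetical setup: an intermediate tensor "score" computed from a leaf tensor.
x = torch.randn(3, requires_grad=True)
score = x * 2  # non-leaf (intermediate) tensor

# Guard the hook registration: when gradients are disabled (e.g. during an
# eval loop running under torch.no_grad()), score.requires_grad is False
# and register_hook would raise a RuntimeError.
if score.requires_grad:
    score.retain_grad()  # only needed if you want to read score.grad afterwards
    h = score.register_hook(lambda grad: grad * 2.0)  # example: scale the gradient

score.sum().backward()
```

With the guard in place, the same forward code runs unchanged whether or not gradients are enabled; the hook is simply never attached when it could not fire anyway.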
Also, a side note: you don't need to call retain_grad() for the hook to work; it is only needed if you want to read the value in the Tensor's .grad field.
An intermediate Tensor will have requires_grad=True in Python whenever it is computed from inputs that require gradients, so you can use that flag as a hint for whether it participates in a computation that needs gradients.
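To illustrate why the flag flips between phases: model.eval() by itself does not disable autograd; it is the torch.no_grad() context typically wrapped around evaluation loops that makes intermediate tensors report requires_grad=False. A small standalone check (no model involved, just a toy computation):

```python
import torch

x = torch.randn(3, requires_grad=True)

# Training-style forward: the intermediate inherits requires_grad=True,
# so registering a hook on it is safe.
score = x * 2
print(score.requires_grad)  # True

# Eval-style forward under no_grad: the intermediate reports
# requires_grad=False, so the guard skips hook registration and
# avoids the RuntimeError from the question.
with torch.no_grad():
    score_eval = x * 2
print(score_eval.requires_grad)  # False
```

This is why guarding on score.requires_grad cleanly disables the hook in an evaluation loop that runs under torch.no_grad(), with no explicit train/eval bookkeeping.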