I’m trying to register a hook on a tensor that should require a gradient, but I get the error:

> RuntimeError: cannot register a hook on a tensor that doesn’t require gradient

This happens on the line where I register the hook:
```python
def forward(self, x):
    c_outputs = [c(x) for c in self.children]
    output = torch.exp(c_outputs[0])
    print(c_outputs[0].requires_grad)
    print(output.requires_grad)
    if self.clip_grad:
        output.register_hook(modify_grad)
    return output
```
Output:

```
True
False
```
```
[...]
File "/tree.py", line 251, in forward
    output.register_hook(modify_grad)
File "xxx/anaconda3/envs/deep/lib/python3.8/site-packages/torch/_tensor.py", line 430, in register_hook
    raise RuntimeError("cannot register a hook on a tensor that "
RuntimeError: cannot register a hook on a tensor that doesn't require gradient
```
However, the print statements indicate that the input to the `exp` operation requires a gradient, while the output of `exp` does not. I found similar behaviour with the `repeat` operator, but there, switching it out for the `expand` operator helped. Any idea how I can make the output require a gradient?
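For comparison, here is a minimal sketch (a plain tensor instead of my actual model, and a hypothetical clamping hook in place of `modify_grad`) where `exp` preserves `requires_grad` as I'd expect, and one situation I know of where it doesn't, namely inside `torch.no_grad()`:

```python
import torch

x = torch.randn(3, requires_grad=True)

# Normally, exp of a grad-requiring tensor also requires grad,
# so registering a hook succeeds:
y = torch.exp(x)
print(y.requires_grad)  # True
y.register_hook(lambda g: g.clamp(-1.0, 1.0))  # hypothetical stand-in for modify_grad

# But with autograd disabled, the result is detached from the graph,
# and register_hook would raise the RuntimeError from my traceback:
with torch.no_grad():
    z = torch.exp(x)
print(z.requires_grad)  # False
```

In my case I'm not (knowingly) inside a `no_grad` block, so I'm not sure this explains it.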