I’d like to change the value of the gradient with register_hook. This could mean changing the gradient’s size or type so it no longer matches the parameter. That is, I’d like to do something like:
    def encode(name):
        def hook(grad):
            # return a dict instead of a Tensor -- this is exactly
            # what the current check rejects
            return {'name': name, 'grad': grad}
        return hook

    def decode(hook_output):
        return hook_output['grad']

    for name, param in model.named_parameters():
        param.register_hook(encode(name))

    # later; pseudo-code
    for hook_out in hook_outputs:
        grad = decode(hook_out)
but this throws an error at python_hook.cpp#L147.
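For comparison, here is a sketch of what the current check does allow: a hook can record whatever it needs as a side effect and return None (which leaves the gradient untouched), or it can return a Tensor of the same shape and type as the incoming gradient. The nn.Linear model and the module-level hook_outputs dict below are stand-ins I’m introducing for illustration, not part of the proposal:

    import torch
    import torch.nn as nn

    model = nn.Linear(4, 2)
    hook_outputs = {}

    def encode(name):
        def hook(grad):
            # Stash the gradient as a side effect. Returning None
            # leaves the gradient unchanged, so the check never fires.
            hook_outputs[name] = {'name': name, 'grad': grad}
        return hook

    for name, param in model.named_parameters():
        param.register_hook(encode(name))

    # Returning a Tensor is also fine, as long as its shape and type
    # match the incoming gradient:
    model.weight.register_hook(lambda g: 2 * g)

    model(torch.randn(3, 4)).sum().backward()
    # hook_outputs now maps 'weight' and 'bias' to their gradients

This works, but it forces every hook that wants to produce something other than a same-shaped Tensor into side effects, which is what the encode/decode pattern above was trying to avoid.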
I can put in a PR to skip that check. The most natural API would be a check keyword argument on register_hook that defaults to True.
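With that argument (proposed here, not an existing parameter), the snippet above would just become:

    # check=False would skip the shape/type validation on the hook's
    # return value, so the hook could return an arbitrary object:
    for name, param in model.named_parameters():
        param.register_hook(encode(name), check=False)

Existing callers would be unaffected, since the default check=True preserves today’s behavior.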
(I’ve filed pytorch#2787 on GitHub, but we should also talk about it here.)