Changing variable size or type with register_hook

I’d like to change the value of the gradient with register_hook. This could mean changing the size or type of the parameter’s gradient.
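For context, register_hook normally lets a hook replace the gradient, as long as the returned tensor has the same shape and type. A minimal example of that supported usage:

```python
import torch

x = torch.randn(3, requires_grad=True)

# The hook receives the gradient and may return a replacement tensor
# of the same shape and type.
x.register_hook(lambda grad: grad * 10)

(x * 2).sum().backward()
# x.grad is now 2 * 10 = 20 elementwise instead of 2
```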

That is, I’d like to perform

def encode(name):
    def hook(grad):
        # Wrap the gradient with metadata; returning a dict instead of
        # a tensor is what trips the check in python_hook.cpp.
        return {'name': name, 'grad': grad}
    return hook

def decode(hook_output):
    return hook_output['grad']

for name, param in model.named_parameters():
    param.register_hook(encode(name))

# later; pseudo-code
for hook_output in hook_outputs:
    grad = decode(hook_output)

but this throws an error at python_hook.cpp#L147.

I can put in a PR to skip that check. The most natural API would be to have a check keyword argument in register_hook that defaults to True.
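Until something like that exists, one workaround that passes the existing check is to record the gradient and its metadata out-of-band and return nothing from the hook (a hook returning None leaves the gradient unchanged). A minimal sketch, where `grads` is a dict I introduce for illustration:

```python
import torch
import torch.nn as nn

grads = {}  # name -> gradient, collected as a side effect

def make_hook(name):
    def hook(grad):
        # Store the gradient keyed by parameter name and return None,
        # so the hook's type/size check is never triggered.
        grads[name] = grad.detach().clone()
    return hook

model = nn.Linear(4, 2)
for name, param in model.named_parameters():
    param.register_hook(make_hook(name))

model(torch.randn(1, 4)).sum().backward()
# grads now maps 'weight' and 'bias' to their gradient tensors
```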

(I’ve filed pytorch#2787 on github but we should also talk about it here)

I think the consequences of removing that check are much greater than they might appear, especially if the JIT is enabled.
I’ll discuss this with the core devs tomorrow and write some feedback on your github issue / here.

Alban responded at https://github.com/pytorch/pytorch/issues/2787
Let’s continue our discussion there.