How to register a hook on only part of the weights

Hi

I am new to Python and not very familiar with lambda expressions. I know I can register a hook on a weight by doing

h = model.classifier.weight.register_hook(lambda grad: grad * 0.)
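
For reference, a self-contained sketch of what that lambda does (the small nn.Linear stand-in for model.classifier is made up for illustration): the hook fires during backward, and the tensor it returns replaces the gradient, so the weight's .grad ends up all zeros.

import torch
import torch.nn as nn

classifier = nn.Linear(4, 2)  # stand-in for model.classifier
h = classifier.weight.register_hook(lambda grad: grad * 0.)

classifier(torch.randn(3, 4)).sum().backward()
print(classifier.weight.grad)  # all zeros: the hook returned grad * 0.
h.remove()  # detach the hook once it is no longer needed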

But is there a way to register a hook on only part of the weight, for example on grad[0]?

Thank you.

Hi,

No, you cannot, but you can apply your hook to just part of the gradient:

def hook_fn(grad):
    new_grad = grad.clone()  # hooks should not modify their argument in place
    new_grad[0].zero_()      # zero only the first row of the gradient
    return new_grad          # the returned tensor is used in place of grad

h = model.classifier.weight.register_hook(hook_fn)
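
With hook_fn defined as above, a quick end-to-end check (the nn.Linear stand-in for model.classifier is made up here):

import torch
import torch.nn as nn

model = nn.Sequential()
model.classifier = nn.Linear(4, 2)  # hypothetical classifier layer

h = model.classifier.weight.register_hook(hook_fn)
model.classifier(torch.randn(3, 4)).sum().backward()
print(model.classifier.weight.grad[0])  # zeros: the hook cleared this row
print(model.classifier.weight.grad[1])  # ordinary gradient values
h.remove()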

Hi @albanD, I have seen your answers and was wondering if there is a faster way of modifying the gradients. I tried to implement your code, but when using hooks my GPU stalls, and training and inference are much slower than without hooks. Is there any way of parallelizing the code, assuming I want to check every element of the gradient tensor?
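
In case it helps, one way to keep a hook cheap is to express the per-element check as vectorized tensor operations instead of a Python loop over elements, so the work stays on the GPU in a few kernel launches. A minimal sketch under that assumption (the threshold condition here is invented for illustration):

import torch

def masked_hook(grad):
    # vectorized per-element check: no Python loop over entries
    mask = grad.abs() > 1.0  # hypothetical condition on every element
    return torch.where(mask, torch.zeros_like(grad), grad)

h = model.classifier.weight.register_hook(masked_hook)

Also worth checking: calling .item() on a tensor or printing it inside the hook forces a GPU-to-CPU synchronization on every backward pass, which is a common cause of the kind of slowdown you describe.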