How to register hook to only part of weights

Hi

I am new to Python and am not very familiar with lambda expressions. I know I can register a hook on a weight by doing

h = model.classifier.weight.register_hook(lambda grad: grad * 0.)

But is there a way to register a hook on only part of the weight, for example on grad[0]?

Thank you.

Hi,

No, you cannot register a hook on only a slice of a tensor, but you can apply your hook to just the part of the gradient you care about:

def hook_fn(grad):
    new_grad = grad.clone()  # hooks should not modify their argument in place
    new_grad[0].zero_()      # zero out the gradient of row 0 only
    return new_grad

h = model.classifier.weight.register_hook(hook_fn)
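To see this end to end, here is a minimal self-contained sketch. The tiny model (a single `classifier` linear layer with made-up sizes) is hypothetical, just to mirror the `model.classifier.weight` name from the question; after `backward()`, row 0 of the weight gradient is zeroed while the other rows are left untouched:

```python
import torch
import torch.nn as nn


class Net(nn.Module):
    # Hypothetical tiny model, just so model.classifier.weight exists
    def __init__(self):
        super().__init__()
        self.classifier = nn.Linear(4, 3)


def hook_fn(grad):
    new_grad = grad.clone()  # hooks should not modify their argument in place
    new_grad[0].zero_()      # zero out the gradient of row 0 only
    return new_grad


model = Net()
h = model.classifier.weight.register_hook(hook_fn)

x = torch.randn(2, 4)
model.classifier(x).sum().backward()

grad = model.classifier.weight.grad
print(grad[0])   # row 0: all zeros, thanks to the hook
print(grad[1:])  # remaining rows: the ordinary gradient
h.remove()       # detach the hook when you no longer need it
```

Remember to keep the handle `h` around and call `h.remove()` when the hook is no longer needed, otherwise it stays attached for every subsequent backward pass.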