No, you cannot, but you can apply your hook to just part of the gradient by returning a modified copy:
def hook_fn(grad):
    # Hooks should not modify their argument in place; work on a clone instead.
    new_grad = grad.clone()
    # Zero only the first row; the rest of the gradient flows through unchanged.
    new_grad[0].zero_()
    # Returning a tensor replaces the gradient that gets accumulated into .grad.
    return new_grad

h = model.classifier.weight.register_hook(hook_fn)
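Here is a minimal runnable sketch of the whole pattern; the small Net module with a classifier layer and the random input are stand-ins for whatever your actual model and data look like:

import torch
import torch.nn as nn

class Net(nn.Module):  # stand-in for your real model
    def __init__(self):
        super().__init__()
        self.classifier = nn.Linear(4, 3)

    def forward(self, x):
        return self.classifier(x)

model = Net()

def hook_fn(grad):
    new_grad = grad.clone()  # never modify grad in place
    new_grad[0].zero_()
    return new_grad

h = model.classifier.weight.register_hook(hook_fn)

model(torch.randn(2, 4)).sum().backward()
print(model.classifier.weight.grad[0])  # tensor([0., 0., 0., 0.])
print(model.classifier.weight.grad[1])  # non-zero, unaffected by the hook

h.remove()  # detach the hook once you no longer need it

Note that register_hook returns a handle, so you can call h.remove() later; that matters if you only want the masking during part of training.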