forward_pre_hook in a Quantized Model

I am trying to update the input to a linear layer in the BERT transformer using a forward_pre_hook:

nn = model_dynamic_quantized.bert.encoder.layer[0].attention.self.key

def gethook(mask):
    def forward_pre_hook(model_dynamic_quantized, input):
        print(input)
        input = input * 0
    return input

h = nn.register_forward_pre_hook(gethook(0))

classifier = pipeline(task='text-classification', model=model_dynamic_quantized,
                      tokenizer=tokenizer)

results = classifier("A stirring, funny and finally transporting re-imagining of Beauty and the Beast and 1930s horror films")
print(results)

Then I detach the hook:

h.remove()

I am getting this error:

TypeError: Kernel.raw_input() takes from 1 to 2 positional arguments but 3 were given
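
For reference, a minimal self-contained sketch of the standard forward_pre_hook pattern on a plain nn.Linear (the toy layer, the mask value, and the variable names are illustrative, not from the model above): the factory returns the hook function itself, and the hook returns a tuple that replaces the module's positional inputs.

import torch
import torch.nn as nn

layer = nn.Linear(4, 2)

def gethook(mask):
    def forward_pre_hook(module, inputs):
        # inputs is a tuple of the module's positional arguments
        print(inputs)
        # returning a tuple replaces the original inputs before forward runs
        return (inputs[0] * mask,)
    return forward_pre_hook  # return the hook function, not the builtin input

h = layer.register_forward_pre_hook(gethook(0))
out = layer(torch.randn(1, 4))  # forward now runs on the zeroed input
h.remove()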