Hi everyone, I would like to discuss a few things regarding the register_forward_hook method.
I am trying to modify the output of an intermediate layer of a pre-trained model (for example, ResNet50's layer4[2].conv2) and pass the modified output forward through the rest of the network to get the final result. There is very little documentation on this and I have been stuck on it for a long time now.
Specifically, I am trying to assign a different variable to the output of this layer and then pass it on forward. Could someone please help with this?
I have tried this and have been able to get the basic case working.
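For reference, here is a minimal sketch of the simple case that does work for me (the doubling is just a placeholder modification, and the random 224x224 input is only for illustration):

import torch
import torchvision.models as models

model = models.resnet50(pretrained=True).eval()

def scale_hook(module, input, output):
    # Returning a tensor from a forward hook replaces the layer's output,
    # so everything downstream receives the modified activation.
    return output * 2.0

handle = model.layer4[2].conv2.register_forward_hook(scale_hook)
with torch.no_grad():
    out = model(torch.randn(1, 3, 224, 224))
handle.remove()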
The problem arises when I try to assign a custom tensor of the same shape instead. The custom tensor is the output from an intermediate layer of another pre-trained model, and I want to see its effect on the main model that I am placing the hooks on. Could you please check whether this is possible to achieve, and how? Here is my attempt:
def hook_1(module, input, output):
    # custom_operation is my helper that pulls the target layer's activation out of model_2
    modified_output = custom_operation(model_2, target_layer_name)
    output[0] = modified_output
    return output

hook_handle = model.layer4[2].conv2.register_forward_hook(hook_1)
output_modified = model(input_tensor)
# This output doesn't change even after hook_handle is applied for the same input tensor
I can't figure out why it isn't modifying the intermediate layer's output.
I think the code snippet I wrote never actually calls the forward function on model_2, and hence there is no change in the outputs of the pre-trained models 1 and 2.
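If that diagnosis is right, I think the fix is to give model_2 its own hook and actually run its forward pass before the main model. Here is a minimal sketch of what I have in mind (assuming model_2 is also a ResNet50 so the layer4[2].conv2 shapes match; capture_hook and inject_hook are names I made up):

import torch
import torchvision.models as models

model = models.resnet50(pretrained=True).eval()    # main model ("model 1")
model_2 = models.resnet50(pretrained=True).eval()  # model supplying the replacement activation

captured = {}

def capture_hook(module, input, output):
    # Stash model_2's activation so it can be injected into the main model.
    captured["act"] = output.detach()

def inject_hook(module, input, output):
    # Replace the main model's activation with the one captured from model_2.
    return captured["act"]

model_2.layer4[2].conv2.register_forward_hook(capture_hook)
model.layer4[2].conv2.register_forward_hook(inject_hook)

input_tensor = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    model_2(input_tensor)                  # run model_2 first so capture_hook fires
    output_modified = model(input_tensor)  # inject_hook then swaps in the captured tensor

With this, running model_2 first fills captured["act"], and the hook on the main model swaps it in, so output_modified should finally differ.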
I believe this is an issue at my end. Please share your code if you are facing a similar issue.