The naming is a bit misleading, as grad_in and grad_out are the conventional names in backward hooks.
In forward hooks the natural names would simply be input and output.
You are basically creating a function named hook_function with a specific signature which is expected by register_forward_hook.
register_forward_hook makes sure to call the function you’ve passed with three arguments: the module itself, and the input and output of the nn.Module you’ve registered it to.
This is done automatically, so you don’t actually see in your code where input and output are created.
The last line registers hook_function on the currently selected layer. selected_layer has to be set beforehand or needs a default value.
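To make this concrete, here is a minimal sketch of the registration flow; the toy model is an assumption, while selected_layer and hook_function mirror the names in the snippet under discussion:

```python
import torch
import torch.nn as nn

# toy model standing in for whichever network you are inspecting
model = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1),
    nn.ReLU(),
    nn.Conv2d(8, 16, 3, padding=1),
)

captured = []

def hook_function(module, input, output):
    # module is the layer itself, input a tuple of its input tensors,
    # output the tensor the layer returned
    captured.append(output.shape)

selected_layer = model[2]  # pick the layer you want to observe
handle = selected_layer.register_forward_hook(hook_function)

model(torch.randn(1, 3, 32, 32))  # the hook fires during this forward pass
handle.remove()  # detach the hook once you are done
print(captured[0])  # torch.Size([1, 16, 32, 32])
```

Note that you never call hook_function yourself; the forward pass triggers it.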
No, they are just variables. In a forward hook they are conventionally called input and output, not grad_in and grad_out.
You could also name them a and b. The important fact is that register_forward_hook expects a function taking three arguments: the module itself, the input to this layer, and its output.
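A small sketch showing that the parameter names are arbitrary; the hook still receives the module, its input tuple, and its output, whatever you call them (the Linear layer here is just an assumption for illustration):

```python
import torch
import torch.nn as nn

layer = nn.Linear(4, 2)
seen = {}

def my_hook(a, b, c):
    # a: the nn.Module, b: tuple of input tensors, c: the output tensor
    seen["module"] = type(a).__name__
    seen["input_shape"] = b[0].shape
    seen["output_shape"] = c.shape

layer.register_forward_hook(my_hook)
layer(torch.randn(1, 4))
print(seen["module"])  # Linear
```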
self.conv_output just saves the activation of the first sample in the batch for the filter given by self.selected_filter.
It’s an attribute of your class used for visualization; later you can access my_class.conv_output to plot the activation map.
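The pattern could be sketched as follows; the attribute names conv_output and selected_filter mirror the snippet under discussion, while the class name, toy model, and layer lookup are assumptions:

```python
import torch
import torch.nn as nn

class FilterVisualizer:
    def __init__(self, model, selected_layer, selected_filter):
        self.model = model
        self.selected_filter = selected_filter
        self.conv_output = None
        # look the layer up by its name and attach the hook to it
        layer = dict(model.named_modules())[selected_layer]
        layer.register_forward_hook(self.hook_function)

    def hook_function(self, module, input, output):
        # keep the activation map of the chosen filter for the
        # first sample in the batch
        self.conv_output = output[0, self.selected_filter]

model = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU())
viz = FilterVisualizer(model, "0", selected_filter=2)
model(torch.randn(4, 3, 16, 16))
print(viz.conv_output.shape)  # torch.Size([16, 16])
```

After the forward pass, viz.conv_output holds a 2D activation map you can pass to e.g. matplotlib's imshow.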