Is my hook correct?

I am using forward and backward hooks in my PyTorch densenet121 model.
I set requires_grad to False for training:

for param in model.features.parameters():
    param.requires_grad = False

I want to get the gradient of the last conv layer in the network, so I register a backward hook like this:

model.features.denseblock4.denselayer16.conv2.register_backward_hook(h1)
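
where h1 just appends the incoming gradients to a list, roughly like this:

grads = []

def h1(module, grad_input, grad_output):
    # grad_output[0] is the gradient w.r.t. the conv layer's output
    grads.append(grad_output[0])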

but this gives me an empty list. However, if I set requires_grad = True for the last layer, then I do get the gradient:

for param in model.features.denseblock4.denselayer16.conv2.parameters():
    param.requires_grad = True

I use the gradient for Grad-CAM.

Is this the right way to do it, or is there another way to get the gradient?

It’s expected that no gradients will be available in the hook if you disable the gradient calculation.
register_backward_hook is also deprecated in favor of register_full_backward_hook, so you might want to change the usage.
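
A minimal sketch of the updated usage (keeping requires_grad enabled for the hooked layer so that its gradient is actually computed) could look like this:

import torch
import torchvision

model = torchvision.models.densenet121()  # untrained weights are fine for demonstrating the hook
model.eval()

# freeze the feature extractor
for param in model.features.parameters():
    param.requires_grad = False

# keep gradient calculation enabled for the layer you want to hook
for param in model.features.denseblock4.denselayer16.conv2.parameters():
    param.requires_grad = True

grads = []

def h1(module, grad_input, grad_output):
    # grad_output[0] is the gradient w.r.t. the conv layer's output (used for Grad-CAM)
    grads.append(grad_output[0].detach())

handle = model.features.denseblock4.denselayer16.conv2.register_full_backward_hook(h1)

x = torch.randn(1, 3, 224, 224)
out = model(x)
out[0, out.argmax()].backward()  # backprop from the predicted class score
print(grads[0].shape)  # gradient of the last conv layer's output
handle.remove()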