Get the gradients from a hook

I was hoping to read a module's weight gradients from inside a register_backward_hook callback:

 module.weight.grad

But it is either None or all zeros. Which hook should I use to get the gradients?
(I can read the gradients without any problem from the training loop.)
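A minimal reproduction of what I'm seeing, using register_full_backward_hook (the non-deprecated variant of register_backward_hook); the toy shapes here are made up:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
lin = nn.Linear(4, 2)

# Inside the module backward hook, module.weight.grad is typically still
# unset: the autograd engine writes .grad in a separate AccumulateGrad
# step, which is not guaranteed to have run by the time the hook fires.
def hook(module, grad_input, grad_output):
    print(module.weight.grad)  # prints None here in my runs

lin.register_full_backward_hook(hook)

x = torch.randn(3, 4, requires_grad=True)
lin(x).sum().backward()
print(lin.weight.grad is not None)  # True once backward() has returned
```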

This is my model; I tested it on MNIST.

import torch.nn as nn
import torch.nn.functional as F

class M2(nn.Module):
    def __init__(self):
        super().__init__()
        self.l1 = nn.Linear(28*28, 10)

    def forward(self, x):
        x = x.reshape(-1, 28*28)
        x = F.relu(self.l1(x))
        return x
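For what it's worth, a hook registered directly on the weight tensor (Tensor.register_hook) does fire with the gradient; a sketch with made-up shapes rather than my actual model:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
lin = nn.Linear(4, 2)
captured = {}

# A tensor-level hook fires with the gradient w.r.t. the weight itself,
# just before it is accumulated into lin.weight.grad.
def weight_hook(grad):
    captured['w_grad'] = grad.clone()

lin.weight.register_hook(weight_hook)

x = torch.randn(3, 4)
F.relu(lin(x)).sum().backward()

# The hook saw the same tensor that ends up in .grad
print(torch.allclose(captured['w_grad'], lin.weight.grad))  # True
```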

Here is the full notebook:

I have the same issue. Did you find a solution?
I want to modify the gradient (module.weight.grad) based on the g_inp and g_out of the module.
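One workaround I'd try (a sketch, not tested against the notebook above): save g_inp / g_out in a full backward hook, then rewrite module.weight.grad after loss.backward() returns, when .grad is reliably populated. The scaling rule below is just a placeholder for whatever modification you need:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
lin = nn.Linear(4, 2)
saved = {}

# Stash g_inp / g_out while backward runs (register_full_backward_hook
# needs PyTorch >= 1.8).
def module_hook(module, grad_input, grad_output):
    saved['g_inp'] = grad_input[0].detach()
    saved['g_out'] = grad_output[0].detach()

lin.register_full_backward_hook(module_hook)

x = torch.randn(3, 4, requires_grad=True)
F.relu(lin(x)).sum().backward()

# .grad is guaranteed to be populated once backward() has returned,
# so modify it here (placeholder rule: scale by the grad_output norm).
lin.weight.grad.mul_(saved['g_out'].norm())
print(lin.weight.grad.shape)  # torch.Size([2, 4])
```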