pdb cannot break inside a PyTorch backward hook

Here is my code.

import torch

v = torch.tensor([0., 0., 0.], requires_grad=True)
x = 1

def f(grad):
    # Backward hook: doubles the gradient flowing into v.
    global x
    x = 2
    return grad * 2

h = v.register_hook(f)  # register the hook on v
v.backward(torch.tensor([1., 2., 3.]))
h.remove()
print(v.grad)  # expected: tensor([2., 4., 6.])

When I debug with pdb, I cannot break inside f: I set a breakpoint at the statement x = 2, but it is never hit.
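One variant I can sketch (based on my assumption that the problem is only with breakpoints set at the pdb prompt, not with the debugger itself) is to drop into pdb directly from inside the hook with pdb.set_trace():

import pdb
import torch

v = torch.tensor([0., 0., 0.], requires_grad=True)

def f(grad):
    # Enter the debugger from inside the hook itself instead of
    # relying on a breakpoint set at the pdb prompt.
    pdb.set_trace()
    return grad * 2

h = v.register_hook(f)
v.backward(torch.tensor([1., 2., 3.]))
h.remove()
print(v.grad)

Is this the expected way to do it, or is there a way to make a normal pdb breakpoint work?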

Does anyone know how to solve this?

Note: in PyCharm I can break inside the function, but on the remote server I would like to use pdb.
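For reference, on Python 3.7+ the built-in breakpoint() should default to pdb.set_trace(), so placing it inside the hook would also work in a plain terminal session over SSH (this is just the same workaround as above without an explicit import):

def f(grad):
    breakpoint()  # Python 3.7+: defaults to pdb.set_trace()
    global x
    x = 2
    return grad * 2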