Debugging F.softplus() for nan gradient

Hi,
In my multi-layer network, F.softplus(x) gives me a NaN gradient, and I want to know which x values and incoming gradients are causing it.
What is the best approach to debug this?

Thanks!

Have you tried registering a backward hook on the gradient function to inspect the values?
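
A minimal sketch of what I mean, assuming the softplus sits inside an nn.Module (the Net class, layer names, and the check_grads hook below are just illustrative, not your actual model):

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(10, 10)
        self.softplus = nn.Softplus()

    def forward(self, x):
        return self.softplus(self.linear(x))

def check_grads(module, grad_input, grad_output):
    # grad_output: gradients flowing into the module from downstream
    # grad_input: gradients the module passes back to its inputs
    for name, grads in (("grad_input", grad_input), ("grad_output", grad_output)):
        for g in grads:
            if g is not None and torch.isnan(g).any():
                print(f"NaN detected in {name} of {module.__class__.__name__}")

net = Net()
net.softplus.register_full_backward_hook(check_grads)

out = net(torch.randn(4, 10))
out.sum().backward()
```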

Thanks!
I am currently trying the variable (tensor) hook instead of the module hook, plus a lambda function to capture the input x. It seems to work, something like the sketch below.
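
A rough sketch of that approach, under the assumption that the hook is registered on the softplus output and the closure captures x so both the offending x values and the incoming gradient can be printed (the inspect helper is just an illustrative name):

```python
import torch
import torch.nn.functional as F

def inspect(x, grad):
    # grad is the gradient flowing into softplus's output from downstream
    if torch.isnan(grad).any():
        bad = torch.isnan(grad)
        print("incoming grad has NaN at x =", x[bad])
    return grad  # return it unchanged so backward proceeds normally

x = torch.randn(8, requires_grad=True)
y = F.softplus(x)
y.register_hook(lambda grad: inspect(x, grad))  # lambda captures x

y.sum().backward()
```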