Debugging `RuntimeError: dependency not found for torch::autograd::AccumulateGrad`

I’m getting this error when calling `torch.autograd.grad` inside the static `backward` method of a custom `torch.autograd.Function`.
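For context, here is a minimal sketch of the pattern I mean — a hypothetical toy `Function` (not my actual code) whose `backward` re-enters autograd via `torch.autograd.grad`. This toy version runs fine on its own; my real code follows the same shape but hits the error:

```python
import torch

class Square(torch.autograd.Function):
    """Toy example: backward() itself calls torch.autograd.grad."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x * x

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        # Grad mode is off inside backward by default, so re-enable it
        # to build a local graph we can differentiate through.
        with torch.enable_grad():
            x2 = x.detach().requires_grad_(True)
            y = x2 * x2
            # The pattern in question: torch.autograd.grad inside backward.
            (gx,) = torch.autograd.grad(y, x2, grad_out)
        return gx

x = torch.tensor([3.0], requires_grad=True)
out = Square.apply(x)
out.backward(torch.ones_like(out))
print(x.grad)  # tensor([6.])
```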

My codebase is too large to post here. The error appears to be raised from the C++ autograd engine rather than from Python, and I’d appreciate advice on how to debug it. Thanks!

I could be wrong, but this looks like a synchronization bug. I’ve searched the web and haven’t found any related issues.

Moreover, the crash is non-deterministic: sometimes I get this runtime error, and other times a segfault.
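One thing I’ve been trying (sketch below, on a trivial graph rather than my real model): enabling autograd anomaly detection. It won’t catch a C++ segfault, but when the `RuntimeError` does fire it should point back at the forward-pass op that produced the failing backward node:

```python
import torch

# Anomaly mode makes the autograd engine record, for every node, the
# stack trace of the forward op that created it, and prints that trace
# if the node's backward raises. (Adds overhead; debugging only.)
torch.autograd.set_detect_anomaly(True)

x = torch.tensor([2.0], requires_grad=True)
y = (x * x).sum()
y.backward()  # on failure, anomaly mode would report the forward op
print(x.grad)  # tensor([4.])
```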

Can you give more details on what you are doing? Do you have a code sample?