How to trace a custom autograd function

Can anyone please help me work around PyTorch's limitation on tracing a custom autograd function? I found many discussions on this topic in the community, but there still doesn't seem to be a fix for it. Here are some of the discussions (issues) I have already looked through:
https://github.com/pytorch/pytorch/pull/22329
Also issues:
22582, 69741, 32822, and 75935
I really need this for my research, but I can't figure out how to work around it. I read this too, but I couldn't understand how or where to implement the solution (for one thing, I can't find the csrc file in my installation path). I need to trace my model and save it in .pt format so that I can load it again from Rust. The model uses a custom autograd function. torch.jit.trace(model, example) works just fine, but when I try to save the result with traced_model.save("path.pt") I get the same error as in the discussions above (from back in 2019, haha):

```
Remove calls to Python functions before export.
Did you forget to add @script or @script_method annotation?
If this is a nn.ModuleList, add it to __constants__
```
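For reference, here is a stripped-down stand-in that reproduces the error for me (MyFunction and MyModel are just placeholders, not my real code):

```python
import torch
import torch.nn as nn

# Placeholder custom autograd function (my real one is more involved).
class MyFunction(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x.clamp(min=0)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        return grad_output * (x > 0).to(grad_output.dtype)


class MyModel(nn.Module):
    def forward(self, x):
        return MyFunction.apply(x)


model = MyModel()
example = torch.randn(1, 4)

traced_model = torch.jit.trace(model, example)  # tracing itself works
traced_model.save("model.pt")                   # this is where the error is raised
```

The trace call succeeds, but save() complains about the Python function call that tracing left in the graph.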

@tom @ngimel
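Edit: the only hack I can think of so far (which I would rather avoid) is to bypass the autograd.Function entirely when tracing for export, since I only need inference on the Rust side and tracing only records the forward ops anyway. A minimal sketch, reusing the placeholder MyFunction/MyModel names from above:

```python
import torch
import torch.nn as nn

# MyFunction is the placeholder autograd.Function from the snippet above.

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.export_mode = False  # flip to True right before tracing for export

    def forward(self, x):
        if self.export_mode:
            # Plain tensor ops only, so the trace contains no Python function call.
            return x.clamp(min=0)
        # Normal path: custom backward via the autograd.Function.
        return MyFunction.apply(x)


model = MyModel()
model.export_mode = True
traced = torch.jit.trace(model, torch.randn(1, 4))
traced.save("model.pt")  # saves, since only plain ops were recorded
```

This seems to save fine for the toy example, but it means maintaining the forward computation twice, so I would still really appreciate a proper solution.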