Backpropagation for Riemannian Gradients


Suppose I have functions f: M_1 -> M_2 and g: M_2 -> M_3, where M_1, M_2, and M_3 are Riemannian manifolds (in my case, Cartesian products of copies of hyperbolic space). To backpropagate through the composition g ∘ f, I need to convert the Euclidean gradient of g into its Riemannian gradient before applying the chain rule. I know I could implement this with a torch.autograd.Function, but the g I want to use is rather complicated, and I would prefer not to write out its Euclidean gradient by hand. Is it possible to call an nn.Module inside a torch.autograd.Function so that autograd supplies the Euclidean gradient for me? Or is there another way I could accomplish this?
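For concreteness, here is a minimal sketch of the pattern I have in mind. The nn.Sequential is only a toy stand-in for my real g, and the Euclidean-to-Riemannian conversion is written for the Poincaré ball model, where grad_R = ((1 - ||p||^2)^2 / 4) * grad_E; everything here is an assumption about how the pieces might fit together, not working code I already have:

```python
import torch
import torch.nn as nn

# Toy stand-in for the complicated g: M_2 -> M_3.
g_module = nn.Sequential(nn.Linear(2, 4), nn.Tanh(), nn.Linear(4, 1))

class RiemannianG(torch.autograd.Function):
    @staticmethod
    def forward(ctx, p):
        ctx.save_for_backward(p)
        with torch.no_grad():
            return g_module(p)

    @staticmethod
    def backward(ctx, grad_out):
        (p,) = ctx.saved_tensors
        # Re-run g under enable_grad so autograd gives us the
        # Euclidean gradient without writing it out by hand.
        with torch.enable_grad():
            p = p.detach().requires_grad_(True)
            out = g_module(p)
            (euclid_grad,) = torch.autograd.grad(out, p, grad_out)
        # Euclidean -> Riemannian conversion on the Poincare ball:
        # grad_R = ((1 - ||p||^2)^2 / 4) * grad_E.
        factor = (1 - p.pow(2).sum(-1, keepdim=True)) ** 2 / 4
        # Note: this only returns the gradient w.r.t. p; gradients for
        # g_module's own parameters would need separate handling.
        return factor * euclid_grad

x = (torch.randn(3, 2) * 0.3).requires_grad_(True)  # points inside the ball
y = RiemannianG.apply(x)
y.sum().backward()  # x.grad now holds the converted (Riemannian) gradient
```

The part I am unsure about is whether calling g_module from inside forward/backward like this is sound, or whether something simpler (e.g. a Tensor.register_hook on the intermediate point in M_2 that rescales the gradient) is the intended way to do it.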