Issues with implementation of Riemannian Adam on a hyperboloid with trainable curvature

Hi everyone,

I am trying to construct a Riemannian Adam optimizer for the hyperboloid manifold, starting from the implementation at https://github.com/HazyResearch/hgcn/blob/master/optimizers/radam.py. I am essentially rewriting their hyperbolic graph convolutional network (HGCN) code to gain some understanding. I have managed to implement the models, but I am stuck on the optimizer. My implementation is here: https://github.com/arijitthegame/Geometric-Deep-Learning/blob/master/Hyperbolic_GCN%20.ipynb

But unlike the hgcn repo, I do not define a manifold class, and my trainable curvature 'c' is a separate parameter for each layer of the model. The problem I am facing is that, in order to project the tensors back onto the manifold during the optimization step, I need to know the curvature of the layer they belong to, and it is not clear to me how to grab that parameter from inside the optimizer.
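For concreteness, here is a rough sketch of the kind of workaround I have in mind (my own guess, not code from the hgcn repo): pass each layer's parameters to the optimizer as its own param group with that layer's curvature attached, so the step can look the curvature up when it projects back onto the hyperboloid. `model.layers`, `layer.c`, `build_param_groups`, and `HyperboloidRiemannianAdam` are just names I made up for the sketch, based on how my model happens to be organised.

```python
import torch


# Rough sketch of the workaround I have in mind (my own guess, NOT code from the
# hgcn repo). Assumption: my model keeps its layers in `model.layers`, and each
# layer stores its own trainable curvature tensor as `layer.c`.
def build_param_groups(model):
    """Build one optimizer param group per layer, carrying that layer's curvature."""
    groups = []
    for layer in model.layers:
        groups.append({
            "params": [p for p in layer.parameters() if p.requires_grad],
            "c": layer.c,  # trainable curvature of this particular layer
        })
    return groups


class HyperboloidRiemannianAdam(torch.optim.Optimizer):
    """Skeleton optimizer showing how each group's curvature is available in step()."""

    def __init__(self, params, lr=1e-3, betas=(0.9, 0.999), eps=1e-8):
        defaults = dict(lr=lr, betas=betas, eps=eps, c=None)
        super().__init__(params, defaults)

    @torch.no_grad()
    def step(self, closure=None):
        for group in self.param_groups:
            c = group["c"]  # curvature of the layer this group was built from
            for p in group["params"]:
                if p.grad is None:
                    continue
                # Placeholder Euclidean update; the actual Riemannian Adam update
                # (Riemannian gradient, exponential map, projection back onto the
                # hyperboloid) would go here and can now use `c`.
                p.add_(p.grad, alpha=-group["lr"])


# Usage:
# optimizer = HyperboloidRiemannianAdam(build_param_groups(model), lr=1e-2)
```

Is something like this a reasonable way to do it, or is there a cleaner way to associate each parameter with its layer's curvature inside the optimizer? (I assume the curvature parameters themselves would then go into their own ordinary Euclidean group.)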

Sorry about this jumbled mess of thoughts; I would really appreciate any help.

Best,

Arijit.