Recurrent Attention Unit in GRU (C++)

Hello everyone,

I would like to make some modifications to PyTorch's `torch.nn.GRU` by adding an attention mechanism inside the cell. I have done some research, and it seems I need to go all the way down to the C++ code where the GRU cell's mathematical expressions are implemented.
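For reference, here is my understanding of the cell math I want to modify, sketched in plain Python/PyTorch (these are the standard GRU equations that `torch.nn.GRUCell` computes natively; the marked line is just where I imagine injecting the attention term):

```python
import torch

def gru_cell(x, h, w_ih, w_hh, b_ih, b_hh):
    # Standard GRU cell equations. In PyTorch's stacked weight layout,
    # the gate order along dim 0 is: reset (r), update (z), new (n).
    gi = x @ w_ih.t() + b_ih          # input-side projections for all 3 gates
    gh = h @ w_hh.t() + b_hh          # hidden-side projections for all 3 gates
    i_r, i_z, i_n = gi.chunk(3, dim=1)
    h_r, h_z, h_n = gh.chunk(3, dim=1)
    r = torch.sigmoid(i_r + h_r)      # reset gate
    z = torch.sigmoid(i_z + h_z)      # update gate
    n = torch.tanh(i_n + r * h_n)     # candidate state <- attention would go here
    return (1 - z) * n + z * h        # new hidden state
```

A quick sanity check against the built-in cell (same weights, same output):

```python
cell = torch.nn.GRUCell(4, 8)
x, h = torch.randn(2, 4), torch.randn(2, 8)
out = gru_cell(x, h, cell.weight_ih, cell.weight_hh, cell.bias_ih, cell.bias_hh)
print(torch.allclose(out, cell(x, h), atol=1e-5))
```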

Any idea how I could access this C++ implementation?

Cheers,

Alex