Is there a way to get grad_output for the linear layers of a cudnn GRU?

Hi,

I’m trying to implement KFAC (Add KFAC optimizer · Issue #35801 · pytorch/pytorch · GitHub) for GRUs, and I need access to what would be the grad_output of the linear layers inside the GRU.

However, since the GRU is implemented as a single fused cuDNN unit (I could use a pure PyTorch implementation, but cuDNN is much faster), I’m not sure how to access that. I had a look at the C++ code behind the cuDNN bindings to see where I could hook in, but I’m not familiar with it and find it a bit obscure.
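
For reference, here is roughly what I mean, as a minimal sketch with a hand-written GRU cell (the `PlainGRUCell` and hook names are just mine for illustration, not anything from the cuDNN path). With explicit `nn.Linear` modules I can register backward hooks and grab grad_output for KFAC; the question is how to get the equivalent quantities out of the fused cuDNN kernel:

```python
import torch
import torch.nn as nn

class PlainGRUCell(nn.Module):
    """Hand-written GRU cell so the two linear maps are explicit modules."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.ih = nn.Linear(input_size, 3 * hidden_size)   # W_ir | W_iz | W_in
        self.hh = nn.Linear(hidden_size, 3 * hidden_size)  # W_hr | W_hz | W_hn

    def forward(self, x, h):
        gi = self.ih(x)
        gh = self.hh(h)
        i_r, i_z, i_n = gi.chunk(3, dim=-1)
        h_r, h_z, h_n = gh.chunk(3, dim=-1)
        r = torch.sigmoid(i_r + h_r)
        z = torch.sigmoid(i_z + h_z)
        n = torch.tanh(i_n + r * h_n)
        return (1 - z) * n + z * h

grad_outputs = []

def save_grad_output(module, grad_input, grad_output):
    # grad_output[0] is dL/d(linear pre-activation) -- the quantity KFAC
    # needs to build its Kronecker factors.
    grad_outputs.append(grad_output[0].detach())

cell = PlainGRUCell(8, 16)
cell.ih.register_full_backward_hook(save_grad_output)
cell.hh.register_full_backward_hook(save_grad_output)

x = torch.randn(4, 8)
h = torch.zeros(4, 16)
cell(x, h).sum().backward()
print([g.shape for g in grad_outputs])
```

With `nn.GRU` on CUDA, the whole sequence runs through one cuDNN op, so there are no per-gate linear modules to hook into and (as far as I can tell) the intermediate gradients aren’t exposed.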

Does anyone have a suggestion?