RuntimeError: derivative for _cudnn_rnn_backward is not implemented

I don't know how to solve this problem.
I used torch.autograd.grad to compute a gradient penalty loss, but this error keeps showing up again and again. Has anyone run into the same problem?



Unfortunately, double backward for cuDNN RNNs is not supported; there's an upstream issue tracking this. The recommended way of doing what you want is to write a custom RNN with TorchScript.