I have no idea what is causing this error.
_thnn_fused_gru_cell_backward can be found in pytorch/tools/autograd/derivatives.yaml, so I assumed it had been implemented. Am I right?
Are there any other possible reasons for this error to occur?
Thanks,
Yann
No, it only occurs as the backward of _thnn_fused_gru_cell.
If derivatives.yaml had an entry of its own for it, i.e. a line with - name: _thnn_fused_gru_cell_backward(...), then the backward would be differentiable itself.
Best regards
Thomas
Hi Thomas,
Thanks for your reply.
Does this mean I have to implement the backward function?
Thanks,
Yann
Well, we don't have RNN double backward at the moment.
If you wanted that, you would have to implement it yourself or implement an RNN cell yourself.
Best regards
Thomas
Do you mean that if I implement a cell as a regular nn.Module, it would automatically work?
Yes, exactly.
Best regards
Thomas
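For anyone landing here later: the idea above can be sketched as follows. This is a minimal GRU cell written from standard differentiable primitives, assuming the usual GRU gate equations; the class and attribute names (MyGRUCell, x2h, h2h) are mine, not from PyTorch. Because every operation in forward is an ordinary autograd op, second-order gradients work without any entry in derivatives.yaml.

```python
import torch
import torch.nn as nn

class MyGRUCell(nn.Module):
    """A GRU cell built only from autograd-friendly primitives,
    so double backward works automatically (unlike the fused kernel)."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        # One linear layer each for input and hidden, producing all
        # three gates (reset, update, candidate) at once.
        self.x2h = nn.Linear(input_size, 3 * hidden_size)
        self.h2h = nn.Linear(hidden_size, 3 * hidden_size)

    def forward(self, x, h):
        x_r, x_z, x_n = self.x2h(x).chunk(3, dim=1)
        h_r, h_z, h_n = self.h2h(h).chunk(3, dim=1)
        r = torch.sigmoid(x_r + h_r)    # reset gate
        z = torch.sigmoid(x_z + h_z)    # update gate
        n = torch.tanh(x_n + r * h_n)   # candidate hidden state
        return (1 - z) * n + z * h

cell = MyGRUCell(4, 8)
x = torch.randn(2, 4, requires_grad=True)
h = torch.randn(2, 8, requires_grad=True)
out = cell(x, h).sum()

# First-order gradient, keeping the graph for a second differentiation.
g, = torch.autograd.grad(out, h, create_graph=True)
# Second-order gradient: this is the step that fails with the fused cell.
gg, = torch.autograd.grad(g.sum(), h)
```

The trade-off is speed: this unfused version is slower than the fused CUDA kernel, but it gives you full higher-order differentiability.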