There are no graph nodes that require computing gradients

Hi, I have implemented an RNN with customized middle and output layers.
When I call loss.backward(), I get the error message in the title.
Can anyone shed some light on the cause?
I have included the model and train.py at the link below.

https://github.com/ShihanSu/sequence-pytorch

What is datahp? What does the iterator return?

I’m guessing that your problem is that you’re not casting the input and target data to the Variable type, and therefore autograd cannot compute the gradients… But again, I’m only guessing! I didn’t spend too much time looking at your code.
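If that guess is right, the fix would look something like the sketch below. This is a hypothetical minimal example (the model and data are stand-ins, not the ones from the linked repo), showing old-style PyTorch (pre-0.4) usage where raw tensors must be wrapped in Variable before the forward pass so that autograd has graph nodes that require gradients:

```python
import torch
import torch.nn as nn
from torch.autograd import Variable  # needed in pre-0.4 PyTorch

# Hypothetical stand-in model and loss -- not the actual RNN from the repo.
model = nn.Linear(4, 2)
criterion = nn.MSELoss()

# Wrap raw tensors in Variable before the forward pass; feeding plain
# tensors (in pre-0.4 PyTorch) leaves the graph with no nodes that
# require gradients, which triggers the error in the title.
inputs = Variable(torch.randn(3, 4))
targets = Variable(torch.randn(3, 2))

loss = criterion(model(inputs), targets)
loss.backward()  # works: the model's parameters require gradients

# The parameter gradients are now populated.
print(model.weight.grad is not None)
```

If the inputs stay as plain tensors (or if the model's parameters were somehow created without requiring gradients), backward() has nothing to differentiate with respect to, hence the error.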