Using backward cannot calculate grad for input

I would say, try creating a new tensor before every forward() call:

sample_seq = sample_seq.detach().clone()  # break the link to any previous graph
sample_seq.requires_grad = True           # make it a leaf tensor that accumulates .grad
...forward()...

To me, it looks like you are reusing the same variable sample_seq in further calculations, which makes the graph difficult to trace. I am not sure this is the issue, but you can try it.
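
Here is a minimal, self-contained sketch of the pattern (the model, shapes, and update step are made up just for illustration):

import torch
import torch.nn as nn

# Hypothetical toy model; stands in for whatever forward() you are calling.
model = nn.Linear(8, 1)

sample_seq = torch.randn(4, 8)  # stands in for your input

for step in range(3):
    # Detach from any previous graph and make a fresh leaf tensor,
    # so backward() populates .grad on this tensor.
    sample_seq = sample_seq.detach().clone()
    sample_seq.requires_grad = True

    out = model(sample_seq).sum()
    out.backward()

    print(sample_seq.grad)  # now holds d(out)/d(sample_seq)

    # e.g. update the input and continue; the next iteration detaches again
    sample_seq = sample_seq - 0.1 * sample_seq.grad

Without the detach/clone at the top of the loop, sample_seq becomes a non-leaf tensor after the update, and its .grad would stay None on the next backward().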
