I’d like to keep track of an LSTM hidden state and, at some point, revert to an earlier saved state. I’m running into errors whenever I try to reuse a previous state. I’m brand new to PyTorch, so any help would be greatly appreciated!
There are two things I’ve tried so far. The first is to save off the output Variables and reuse them later. This leads to an error in backward(), since the second call traverses the same graph again and its buffers have already been freed. That part makes sense to me, and retaining the graph (retain_graph=True) doesn’t seem to be what I actually want.
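A stripped-down sketch of this first attempt, using the plain tensor API in place of Variables (the model sizes and inputs are arbitrary, just for reproduction):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
lstm = nn.LSTM(input_size=4, hidden_size=8)

# First chunk: run the LSTM and save the returned (h, c) state.
out, saved_state = lstm(torch.randn(5, 1, 4))
out.sum().backward()  # this frees the graph for the first chunk

# Later: feed the saved state back in and backprop again. The saved
# state still points into the first (already freed) graph.
out2, _ = lstm(torch.randn(5, 1, 4), saved_state)
try:
    out2.sum().backward()
    err = None
except RuntimeError as e:
    err = e
print(err)  # complains about backpropagating through the graph a second time
```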
The second is to get the Tensors out of the output Variables and wrap those saved Tensors in new Variables. This seems to break autograd, giving me a ‘there are no graph nodes that require computing gradients’ runtime error, and I don’t understand why that happens.
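Here is roughly what the second attempt looks like. In my actual code I wrap `state.data` in fresh Variables; below I use `.detach()`, which I believe is the closest tensor-API equivalent, and again the model sizes are arbitrary:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
lstm = nn.LSTM(input_size=4, hidden_size=8)

out, state = lstm(torch.randn(5, 1, 4))
out.sum().backward()

# Rebuild the state from the underlying tensors, cutting it off from the
# old autograd history (equivalent to wrapping state.data in new Variables).
reverted = tuple(s.detach() for s in state)

# Run a new chunk starting from the reverted state and backprop. In my
# full script, the backward() here is where I get the
# 'there are no graph nodes that require computing gradients' error.
out2, _ = lstm(torch.randn(5, 1, 4), reverted)
out2.sum().backward()
```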
Thanks in advance!