```
import torch
import torch.nn as nn

class RNN(nn.Module):
    def __init__(self, input_size, hidden_size, output_size):
        super(RNN, self).__init__()
        self.hidden_size = hidden_size
        self.i2h = nn.Linear(input_size, hidden_size)
        self.h2h = nn.Linear(hidden_size, hidden_size)
        self.h2o = nn.Linear(hidden_size, output_size)
        # hidden state kept as a module attribute on purpose, to reproduce the issue
        self.hidden = torch.zeros(1, self.hidden_size)
        self.softmax = nn.LogSoftmax(dim=1)

    def forward(self, input):
        self.hidden = torch.tanh(self.i2h(input) + self.h2h(self.hidden))
        output = self.h2o(self.hidden)
        output = self.softmax(output)
        return output

rnn = RNN(n_letters, n_hidden, n_categories)
temp_layer = nn.Linear(n_categories * 4, n_categories)  # 4 concatenated outputs

# criterion, optimizer, and randomTrainingExample are defined elsewhere
for iter in range(1, n_iters + 1):
    category, line, category_tensor, line_tensor = randomTrainingExample()
    rnn.zero_grad()
    outputs = []
    for i in range(0, 4):  # T = 4 for BPTT
        output = forward_step(line_tensor, i)  # wraps rnn(line_tensor[i]); defined elsewhere
        outputs.append(output)
    result = torch.cat(outputs, dim=-1)
    result_logit = temp_layer(result)
    loss = criterion(result_logit, category_tensor)
    loss.backward()
    optimizer.step()
```

This is a simple RNN example I created to reproduce an issue I encountered in another task, mimicking BPTT. The behavior may differ slightly from a real RNN because I deliberately defined `self.hidden` in `__init__` to recreate my problem. For simplicity, I omitted the data-loading part.

The issue is that when I run it, I encounter the following error:

RuntimeError: Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed). Saved intermediate values of the graph are freed when you call .backward() or autograd.grad(). Specify retain_graph=True if you need to backward through the graph a second time or if you need to access saved tensors after calling backward.

I suspect this might be because `self.hidden` is being overwritten inside the `forward` function. However, the code works fine for the first couple of iterations and only throws the above error around the third iteration.
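Here is my suspicion stripped down to its core: no `nn.Module`, just a plain tensor carried across iterations (the names here are placeholders I made up for this sketch, not from my actual code):

```python
import torch

w = torch.ones(1, requires_grad=True)  # stands in for the RNN weights
h = torch.zeros(1)                     # persistent hidden state, like self.hidden

# iteration 1: h becomes an intermediate node of the first graph
h = torch.tanh(h + w)
h.sum().backward()        # backward() frees that graph's saved tensors

# iteration 2: the new graph still reaches back into the first one through h
h = torch.tanh(h + w)
try:
    h.sum().backward()    # fails: the first graph was already freed
except RuntimeError as e:
    print(type(e).__name__, str(e)[:60])
```

In this stripped-down form the error already appears on the second `backward()`; in my real code it only surfaces around the third iteration, presumably due to details I removed here.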

Also, I thought the computational graph gets re-initialized in every iteration, so I don't understand why it's trying to reference something again.