I was reading the Translation with a Seq2Seq Network and Attention tutorial here

I don't understand why `encoder_hidden` is re-initialised in each iteration.

Shouldn't `encoder_hidden` be initialised only once?

See the following code snippet:

```
def train(input_tensor, target_tensor, encoder, decoder,
          encoder_optimizer, decoder_optimizer, criterion,
          max_length=MAX_LENGTH):
    encoder_hidden = encoder.initHidden()  # re-initialised on every call
    ...

def trainIters():
    for iter in range(1, n_iters + 1):
        ...
        loss = train(input_tensor, target_tensor, encoder, decoder,
                     encoder_optimizer, decoder_optimizer, criterion)
```
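For context, here is a minimal sketch of the kind of encoder the tutorial uses (written from memory, not the exact tutorial code), showing what `initHidden()` returns and how `train()` starts each sequence from that fresh zero state:

```python
import torch
import torch.nn as nn

class EncoderRNN(nn.Module):
    """Minimal GRU encoder, loosely following the tutorial's shape."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.hidden_size = hidden_size
        self.embedding = nn.Embedding(input_size, hidden_size)
        self.gru = nn.GRU(hidden_size, hidden_size)

    def forward(self, x, hidden):
        # One token at a time: (1, 1, hidden_size) input to the GRU.
        embedded = self.embedding(x).view(1, 1, -1)
        return self.gru(embedded, hidden)

    def initHidden(self):
        # A fresh all-zero hidden state for a new sequence.
        return torch.zeros(1, 1, self.hidden_size)

encoder = EncoderRNN(input_size=10, hidden_size=8)

# Each call to train() starts from a zero state like this,
# then feeds the sequence through token by token:
hidden = encoder.initHidden()
for token in torch.tensor([1, 2, 3]):
    output, hidden = encoder(token, hidden)
print(hidden.shape)  # torch.Size([1, 1, 8])
```

So my question is whether `encoder.initHidden()` really belongs inside `train()`, rather than being called once before the loop in `trainIters()`.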