The forward function of the multi-layer Elman RNN from the tutorial has two errors.

The first is in the layer loop: x[t] is the correct input only for the first layer. For every layer above it, the input should be the hidden state produced by the layer below, i.e. x[t] should be replaced by h_t[layer-1].

The second is that output.append(h_t[-1]) appends a reference to the hidden-state tensor, so later in-place updates to h_t[-1] overwrite the elements already stored in the list. It should be rewritten as output.append(h_t[-1].clone()) so each time step's output is a detached copy.
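To make both fixes concrete, here is a minimal sketch of a corrected forward pass. The class name, use of nn.RNNCell, and tensor layout are assumptions for illustration; the tutorial's actual code may be structured differently, but the two corrected lines are marked.

```python
import torch
import torch.nn as nn

class ElmanRNN(nn.Module):
    """Minimal multi-layer Elman RNN (illustrative sketch, not the
    tutorial's exact code)."""
    def __init__(self, input_size, hidden_size, num_layers):
        super().__init__()
        self.num_layers = num_layers
        self.hidden_size = hidden_size
        # First layer consumes the input; deeper layers consume hidden states.
        self.cells = nn.ModuleList(
            nn.RNNCell(input_size if l == 0 else hidden_size, hidden_size)
            for l in range(num_layers)
        )

    def forward(self, x):
        # x: (seq_len, batch, input_size)
        seq_len, batch, _ = x.shape
        h_t = [torch.zeros(batch, self.hidden_size, device=x.device)
               for _ in range(self.num_layers)]
        output = []
        for t in range(seq_len):
            for layer in range(self.num_layers):
                # Fix 1: layers above the first take the hidden state of
                # the layer below as input, not the raw input x[t].
                inp = x[t] if layer == 0 else h_t[layer - 1]
                h_t[layer] = self.cells[layer](inp, h_t[layer])
            # Fix 2: clone so that if h_t[-1] is later modified in place,
            # the values already appended to output are unaffected.
            output.append(h_t[-1].clone())
        return torch.stack(output), h_t
```

A quick shape check: feeding a (5, 3, 4) sequence through a 2-layer model with hidden size 8 yields an output of shape (5, 3, 8) and a list of 2 final hidden states.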