LSTM in Seq2seq model

I'm trying to use an LSTM instead of a GRU in the "Translation with a Sequence to Sequence Network and Attention" tutorial:
https://pytorch.org/tutorials/intermediate/seq2seq_translation_tutorial.html#sphx-glr-intermediate-seq2seq-translation-tutorial-py
But I've run into a problem.
Has anyone implemented an LSTM in a seq2seq model based on this tutorial?
Thanks.

I've used an LSTM to implement a seq2seq model.
Could you explain your problem in detail?
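
In the meantime, the most common pitfall when swapping `nn.GRU` for `nn.LSTM` in that tutorial is that an LSTM's state is a `(hidden, cell)` tuple rather than a single tensor, so the initial hidden state and everything you pass between encoder and decoder must carry both. Here's a minimal sketch of the tutorial's encoder adapted for an LSTM (the `init_hidden` helper and the single-layer, batch-size-1 shapes are my assumptions based on the tutorial's setup, not code from it):

```python
import torch
import torch.nn as nn

class EncoderRNN(nn.Module):
    """Tutorial-style encoder with nn.GRU swapped for nn.LSTM."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.hidden_size = hidden_size
        self.embedding = nn.Embedding(input_size, hidden_size)
        # Unlike a GRU, an LSTM returns and expects a (hidden, cell) tuple.
        self.lstm = nn.LSTM(hidden_size, hidden_size)

    def forward(self, input, hidden):
        # One token at a time, as in the tutorial: shape (1, 1, hidden_size)
        embedded = self.embedding(input).view(1, 1, -1)
        output, hidden = self.lstm(embedded, hidden)  # hidden is (h_n, c_n)
        return output, hidden

    def init_hidden(self, device="cpu"):
        # Hypothetical helper: an LSTM needs BOTH h_0 and c_0, so we
        # return a tuple of zero tensors instead of a single tensor.
        return (torch.zeros(1, 1, self.hidden_size, device=device),
                torch.zeros(1, 1, self.hidden_size, device=device))

# Quick smoke test with toy sizes
enc = EncoderRNN(input_size=10, hidden_size=16)
hidden = enc.init_hidden()
out, hidden = enc(torch.tensor([3]), hidden)
print(out.shape)  # torch.Size([1, 1, 16])
```

If you only pass the encoder's `h_n` (and not `c_n`) into the decoder, or initialize the state with a single tensor as the GRU version does, you'll typically get a "hidden state is not a tuple" or shape-mismatch error; carrying the full tuple through fixes both.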