Same generation for every sample in a batch with a Seq2Seq model - what to do?

Hi there. I have a Seq2Seq model trained for paraphrase generation. Training seems to be working fine in that the loss goes down, but the generations are poor: the model produces the exact same output for every entry in every batch. The output changes between epochs as the weights are updated, but within an epoch every input still gets the same generation. I have already tried training for many (~100) epochs. Do you have any suggestions for fixing the problem?
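For illustration, here is a minimal sketch of the kind of batched generation call where I see this behaviour. This assumes a Hugging Face encoder-decoder loaded via `AutoModelForSeq2SeqLM`; the checkpoint name, prompts, and `generate` arguments are placeholders rather than my exact setup:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder checkpoint; in practice this is my own trained Seq2Seq model.
tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")
model.eval()

# A batch of clearly different inputs (placeholder sentences).
batch = [
    "paraphrase: The weather is nice today.",
    "paraphrase: He finished the report before the deadline.",
    "paraphrase: The cat sat on the mat.",
]
inputs = tokenizer(batch, padding=True, return_tensors="pt")

with torch.no_grad():
    # Greedy decoding over the whole batch.
    generated = model.generate(**inputs, max_length=32)

# With my trained weights, every decoded string comes out identical,
# even though the inputs differ.
for text in tokenizer.batch_decode(generated, skip_special_tokens=True):
    print(text)
```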