How can I feed a batch into an LSTM without reordering the sequences by length?

I'm new to PyTorch and I've run into some trouble.
I want to build a ranking model that judges the similarity between a question and its answers (including both correct and incorrect answers), and I use an LSTM as the encoder. There are two LSTMs in my model, and they share weights, so my model's inputs are two sequences (a question and an answer). But if I use batching, sorting the batch by length will disrupt the correspondence between questions and answers.
What should I do?

You can keep track of the sort indices and use them to recover the correspondence.
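
A minimal sketch of that idea (the encoder dimensions and function name are illustrative, not from the original post): sort the batch by length for `pack_padded_sequence`, remember the permutation, and invert it afterwards so each row goes back to its original position.

```python
import torch
from torch.nn.utils.rnn import pack_padded_sequence

# Illustrative shared encoder: one LSTM applied to both questions and answers.
lstm = torch.nn.LSTM(input_size=50, hidden_size=64, batch_first=True)

def encode(padded, lengths):
    """Encode a padded batch without losing the original row order.

    padded:  (batch, max_len, input_size) tensor, rows in the ORIGINAL order
    lengths: (batch,) tensor of true sequence lengths
    """
    # Sort by length (descending), remembering the permutation.
    sorted_lengths, sort_idx = lengths.sort(descending=True)
    sorted_batch = padded[sort_idx]

    packed = pack_padded_sequence(sorted_batch, sorted_lengths, batch_first=True)
    _, (h_n, _) = lstm(packed)       # h_n: (num_layers, batch, hidden)

    # Invert the permutation to restore the original order.
    unsort_idx = sort_idx.argsort()
    return h_n[-1][unsort_idx]       # (batch, hidden), back in original order

# Usage: questions[i] still lines up with answers[i] after encoding.
# q_enc = encode(q_padded, q_lengths)
# a_enc = encode(a_padded, a_lengths)
```

Note that newer PyTorch versions also accept `pack_padded_sequence(..., enforce_sorted=False)`, which does this sort/unsort bookkeeping for you internally.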

Thanks! I got into this trouble because I wanted to use pack_padded_sequence to pack the data. It's a high-level function, so I couldn't recover the correspondence. In the end I batched the data myself and padded with zeros manually so that all sequences have the same length.
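
For reference, that manual zero-padding step doesn't have to be hand-written: `torch.nn.utils.rnn.pad_sequence` pads a list of variable-length tensors while keeping their original order. A small sketch (the tensor sizes are made up for illustration):

```python
import torch
from torch.nn.utils.rnn import pad_sequence

# Variable-length sequences, e.g. embedded questions (sizes illustrative).
seqs = [torch.randn(5, 50), torch.randn(3, 50), torch.randn(7, 50)]
lengths = torch.tensor([s.size(0) for s in seqs])

# Pad with zeros up to the longest sequence; rows keep their original
# order, so questions[i] still matches answers[i].
padded = pad_sequence(seqs, batch_first=True)   # shape: (3, 7, 50)
```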