LSTM: processing variable-length sequences

Hi all,

I have a question about processing variable-length sequences with PyTorch's LSTM.
When using torch.nn.utils.rnn.PackedSequence to process variable-length sequences, we have to sort the sequences by length first.
What can I do if I do not want to sort the sequences by length?


You could pass the sorted sequences to your RNN and then reorder the outputs afterwards to match your original ordering.

Toy example:

A = torch.LongTensor([1, 2, 5, 3, 7, 4])
sorted_A, reorder = torch.sort(A)
_, original_ordering = torch.sort(reorder)

sorted_A[original_ordering] will now be exactly the same as A.
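To make the sort/unsort trick concrete, here is a minimal sketch of how it might be applied end-to-end: sort a padded batch by length, pack it, run it through an LSTM, and then restore the original batch order. The tensor shapes, lengths, and variable names here are illustrative, not from the original post.

```python
# Sketch: run an LSTM over unsorted variable-length sequences by sorting,
# packing, and restoring the original order afterwards.
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

batch, max_len, feat, hidden = 5, 4, 3, 8
lengths = torch.LongTensor([1, 2, 3, 2, 1])   # original (unsorted) lengths
x = torch.randn(batch, max_len, feat)         # padded batch, batch_first layout

# Sort by length (descending, as packing requires) and remember how to undo it.
sorted_lengths, reorder = torch.sort(lengths, descending=True)
_, original_ordering = torch.sort(reorder)

lstm = nn.LSTM(feat, hidden, batch_first=True)
packed = pack_padded_sequence(x[reorder], sorted_lengths, batch_first=True)
out_packed, _ = lstm(packed)
out, _ = pad_packed_sequence(out_packed, batch_first=True)

# Undo the sorting so outputs line up with the original batch order.
out = out[original_ordering]
```

Note that pad_packed_sequence pads the output only up to the longest actual length in the batch, so here `out` has shape (5, 3, 8).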


Thank you for your time.
I wonder whether there are PyTorch examples that do not require sorting the sequences by length before feeding them to the LSTM, for a padded batch like the following:
a 0 0 0 0
a b 0 0 0
a b c 0 0
a b 0 0 0
a 0 0 0 0
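If I understand correctly, recent PyTorch versions (1.1 and later) cover exactly this case: pack_padded_sequence accepts enforce_sorted=False, which lets you pass the padded batch in its original order and handles the sorting and unsorting internally. A minimal sketch, with illustrative shapes matching the padded example above:

```python
# Sketch: pack an unsorted padded batch directly with enforce_sorted=False
# (available in PyTorch >= 1.1); no manual sort/unsort needed.
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

# Padded batch shaped like the example rows above (a, ab, abc, ab, a).
x = torch.randn(5, 5, 3)                 # (batch, max_len, features)
lengths = torch.LongTensor([1, 2, 3, 2, 1])

lstm = nn.LSTM(3, 8, batch_first=True)
packed = pack_padded_sequence(x, lengths, batch_first=True, enforce_sorted=False)
out_packed, _ = lstm(packed)
out, _ = pad_packed_sequence(out_packed, batch_first=True)  # original order
```

One caveat: the docs note that enforce_sorted=True is only needed for ONNX export traceability, so enforce_sorted=False is fine for ordinary training and inference.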