Is there an inverse of rnn.pack_sequence?


When we feed sentences into an LSTM:

# Variable-length input sequences 'a' (without padding)
# 10 sentences, embedding size 5
a = [torch.randn(random.randint(1, 4), 5) for _ in range(10)]
# enforce_sorted=True requires sequences sorted by decreasing length
a = sorted(a, key=len, reverse=True)
a = torch.nn.utils.rnn.pack_sequence(a, enforce_sorted=True)
# here `out` is a PackedSequence
out, _ = model.lstm(a)

To unpack `out`,
PyTorch has `torch.nn.utils.rnn.pad_packed_sequence`, which is the inverse of `pack_padded_sequence`. It unpacks the sequences, but returns them padded…
Sure, we can easily strip the padding ourselves…
but sometimes people want a nice, simple way to do it — an inverse of `pack_sequence`.
Is there anything like this?
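A minimal sketch of such an inverse, built from `pad_packed_sequence` plus slicing off the pads (the helper name `unpack_sequence` is my own, not a PyTorch API):

```python
import torch
from torch.nn.utils.rnn import pack_sequence, pad_packed_sequence


def unpack_sequence(packed):
    """Inverse of pack_sequence: recover the list of variable-length tensors."""
    padded, lengths = pad_packed_sequence(packed, batch_first=True)
    # slice each row back down to its true length
    return [padded[i, :lengths[i]] for i in range(len(lengths))]


# round trip: sequences sorted by decreasing length for enforce_sorted=True
seqs = [torch.randn(n, 5) for n in (4, 3, 1)]
packed = pack_sequence(seqs, enforce_sorted=True)
unpacked = unpack_sequence(packed)
assert all(torch.equal(a, b) for a, b in zip(seqs, unpacked))
```

The round-trip assertion at the end shows the padding added by `pad_packed_sequence` is fully removed again.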


(Arunav Shandilya) #2

Instead of sorting the sequences in place, you can use `torch.nn.utils.rnn.pad_packed_sequence`.
You can also try `sorted(a, key=len, reverse=True)` to get a length-sorted copy without actually changing the original list,
and you can pass that directly into the model, with no need to reverse the sort afterwards.
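For what it's worth, here is a short sketch of that idea: `sorted()` returns a new list, so the caller's sequences keep their original order while the sorted copy satisfies `enforce_sorted=True` (variable names are my own):

```python
import torch
from torch.nn.utils.rnn import pack_sequence

seqs = [torch.randn(n, 3) for n in (2, 4, 1)]  # arbitrary order

# sorted() builds a new, length-descending list; `seqs` is untouched
ordered = sorted(seqs, key=len, reverse=True)
packed = pack_sequence(ordered, enforce_sorted=True)

print([len(s) for s in seqs])     # original order preserved
print([len(s) for s in ordered])  # sorted copy fed to the model
```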