# How to properly unsort unpacked sequences?

UPDATE: OK, I think I found a solution. Would doing it like this (see the minimal example code below) be correct?

I am trying to use packed sequences for my model, but since I have more than one input with sequences, I cannot simply sort the outputs: each input may require the batch to be sorted in a different order.

So the only solution I can see here is to “unsort” the unpacked sequences once I have sent them through the LSTM.

The LSTM returns (out,(hend,cend)) where out is a packed sequence. When I unpack this, I get my sequences sorted by length.
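To illustrate the problem, a minimal sketch (tensor values are made up) showing that `pad_packed_sequence` hands the batch back in the length-sorted order, not the original one:

```python
import torch
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

# batch of 3 padded sequences (batch_first), original lengths 3, 1, 2;
# they must be sorted by length, descending, before packing
seqs = torch.tensor([[1., 2., 3.], [4., 0., 0.], [5., 6., 0.]]).unsqueeze(-1)
lens = torch.tensor([3, 1, 2])
lens_sorted, idx = lens.sort(0, descending=True)   # idx = [0, 2, 1]
packed = pack_padded_sequence(seqs[idx], lens_sorted.tolist(), batch_first=True)
unpacked, out_lens = pad_packed_sequence(packed, batch_first=True)
# out_lens is [3, 2, 1]: the unpacked batch is still in sorted order
```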

I still have the indices from sorting, but every method I tried to use them for “unsorting” either breaks my gradient chain or makes PyTorch complain.
If I use the method based on `newvar=sortedvar.gather(dim,unsort_idxs)`, I get the error message “save_for_backward can only save input or output tensors, but argument 0 doesn’t satisfy this condition”.
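One approach that seems to work, sketched below with made-up shapes: build the inverse permutation by sorting the sort indices themselves, then undo the sort with plain advanced indexing, which autograd supports:

```python
import torch

lens = torch.tensor([2, 1, 6, 4])
_, idx = lens.sort(0, descending=True)        # sort indices: [2, 3, 0, 1]
_, unsort_idx = idx.sort(0)                   # inverse permutation
x = torch.randn(4, 6, 5, requires_grad=True)  # e.g. an unpacked LSTM output, batch_first
x_sorted = x[idx]                             # sorted by length
x_restored = x_sorted[unsort_idx]             # back in the original batch order
assert torch.equal(x_restored, x)
x_restored.sum().backward()                   # gradients flow through the indexing
```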

Here is some minimal code to illustrate what I am trying to do:

```python
import torch
from torch.autograd import Variable as V
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

# embedding layer to make it easier to create the sequences of vectors
embs1 = torch.nn.Embedding(6, 3)  # 3-dimensional embeddings
lstm1 = torch.nn.LSTM(3, 5, 1)    # lstm takes 3-dimensional inputs
batch = V(torch.LongTensor([[1,2,0,0,0,0],[3,0,0,0,0,0],[2,4,5,2,3,1],[4,1,2,2,0,0]]))
e_batch = embs1(batch)   # get the batch of sequences of embeddings
# Note: the batch is of shape batchsize,maxseq,3 so we need to use batch_first later
# these are my sequence lengths
lens = [2, 1, 6, 4]
lens_sorted, idx = torch.tensor(lens).sort(0, descending=True)
# sort the embeddings batch by lengths
e_batch_sorted = e_batch[idx]
# create the packed sequences
packed = pack_padded_sequence(e_batch_sorted, lens_sorted.tolist(), batch_first=True)
# get the output from the lstm
(out, (hout, cout)) = lstm1(packed)
# unpack; the sequences come back sorted by length
unpacked, _ = pad_packed_sequence(out, batch_first=True)
# "unsort" back to the original batch order with the inverse permutation
_, unsort_idx = idx.sort(0)
restored = unpacked[unsort_idx]
```