I’m trying to implement a model that works on the hidden states of an LSTM. To help the model learn better on variable-length sequences, I want to use pack_padded_sequence. The problem is that I don’t understand how this packing (and the sorting it requires) affects the hidden states the model returns: are they restored to the original batch order, or do I need to rearrange them myself?
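To make the question concrete, here is a minimal sketch of what I mean (the layer sizes and lengths are just placeholders). I sort the batch by length before packing, and afterwards I un-sort `out` and `h_n` with the inverse permutation — is this manual un-sorting step actually necessary, or does PyTorch already return them in the original order?

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

torch.manual_seed(0)

batch, max_len, input_size, hidden_size = 3, 5, 4, 6
lengths = torch.tensor([5, 2, 3])            # original (unsorted) lengths
x = torch.randn(batch, max_len, input_size)  # padded batch, batch_first

lstm = nn.LSTM(input_size, hidden_size, batch_first=True)

# Sort by length, descending (required when enforce_sorted=True, the default)
sorted_lengths, sort_idx = lengths.sort(descending=True)
x_sorted = x[sort_idx]

packed = pack_padded_sequence(x_sorted, sorted_lengths, batch_first=True)
packed_out, (h_n, c_n) = lstm(packed)
out, _ = pad_packed_sequence(packed_out, batch_first=True, total_length=max_len)

# This is what I'm currently doing: invert the permutation to restore
# the original batch order. Is this step needed?
unsort_idx = sort_idx.argsort()
out = out[unsort_idx]
h_n = h_n[:, unsort_idx]   # h_n has shape (num_layers, batch, hidden)
c_n = c_n[:, unsort_idx]
```

After the un-sorting, `h_n[0, i]` does match `out[i, lengths[i] - 1]` for every sequence `i`, so it seems to work, but I’d like to know whether it’s required or redundant.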