I’m working on document encoding. First I encode each sentence (shape: unknown_nb_sentences x max_nb_words x word_emb), and in a second step another encoder encodes the resulting sentence embeddings (unknown_nb_sentences x sent_emb).
Since my first tensor is already manually padded (which is why I use max_nb_words), I wanted to use nn.utils.rnn.pack_padded_sequence to run the RNN efficiently, but as a prerequisite it expects the sequences sorted by decreasing length. I cannot sort the sentences, because their original order matters for the second encoder.
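For reference, here is a minimal sketch of the packing step I mean (dimensions and variable names are made up for illustration). Note that since PyTorch 1.1, pack_padded_sequence accepts enforce_sorted=False, which sorts internally and restores the original order, so no manual reordering is needed:

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence

# Hypothetical dimensions, just for the sketch
nb_sentences = 3
max_nb_words = 5
word_emb = 8
sent_emb = 16

# Manually padded batch of sentences: (nb_sentences, max_nb_words, word_emb)
sentences = torch.randn(nb_sentences, max_nb_words, word_emb)
# True (unpadded) length of each sentence, in original document order
lengths = torch.tensor([5, 2, 4])

rnn = nn.GRU(word_emb, sent_emb, batch_first=True)

# enforce_sorted=False lets pack_padded_sequence take unsorted lengths:
# it sorts internally and unsorts the result, preserving document order.
packed = pack_padded_sequence(sentences, lengths, batch_first=True,
                              enforce_sorted=False)
_, h_n = rnn(packed)
sentence_embeddings = h_n.squeeze(0)  # (nb_sentences, sent_emb), original order
```

The final hidden state per sentence then goes to the second (document-level) encoder in unchanged order.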
Is there another way to do this, rather than feeding the whole padded sequences through the RNN?