LSTMs on Padded Input Sequences


#1

I’m training an LSTM on variable-length sequences, all padded to the same size. The LSTM trains successfully, but sampling from it yields a lot of the padding character. I’ve seen many implementations that handle padded sequences with pad_packed_sequence and the like, but I already have the sequences padded. Is there an example of how to have the LSTM only backpropagate over the actual sequence? I haven’t found anything in the docs for torch.nn.LSTM.

Thanks


#2

If you have the lengths of the sequences, you can convert your padded tensor to a packed one with pack_padded_sequence. The LSTM then only processes the real timesteps of each sequence, so the padding contributes nothing to the outputs or the gradients.
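
A minimal sketch of that flow, assuming a batch_first padded tensor; the names `padded`, `lengths`, and the dimensions are placeholders:

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

batch_size, max_len, input_size, hidden_size = 4, 10, 8, 16
lstm = nn.LSTM(input_size, hidden_size, batch_first=True)

# Hypothetical padded batch plus the true (unpadded) length of each sequence.
padded = torch.randn(batch_size, max_len, input_size)
lengths = torch.tensor([10, 7, 5, 2])

# Pack so the LSTM skips the padded timesteps entirely.
# enforce_sorted=False lets you pass lengths in any order.
packed = pack_padded_sequence(padded, lengths, batch_first=True, enforce_sorted=False)
packed_out, (h_n, c_n) = lstm(packed)

# Unpack back to a padded tensor if you need per-timestep outputs.
out, out_lengths = pad_packed_sequence(packed_out, batch_first=True)
```

Because the packed representation drops the padding before the LSTM ever sees it, `h_n` holds the hidden state at each sequence’s true last step, and backpropagation only covers the actual sequence.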