Different length sequences as batch input to LSTM


I have an LSTM that outputs a float number at each time-step (it's regression, not classification). I want my batches to contain sequences of different lengths, but I don't know how to implement this in PyTorch.

Hi @farhad-bat,
Please have a look at this link. The `pad_sequence` function will pad your sequences so they are all the same length.
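A minimal sketch of what that looks like, assuming the regression setup from the question (one float per time-step, so each sequence has shape `(length, 1)`):

```python
import torch
from torch.nn.utils.rnn import pad_sequence

# Three sequences of different lengths, each step a single float (regression)
seqs = [torch.randn(5, 1), torch.randn(3, 1), torch.randn(7, 1)]

# pad_sequence zero-pads every sequence up to the longest one (7 here)
padded = pad_sequence(seqs, batch_first=True)
print(padded.shape)  # torch.Size([3, 7, 1])
```

With `batch_first=True` the result is `(batch, max_len, features)`, which is what `nn.LSTM(batch_first=True)` expects.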


You have two basic alternatives:

- Pad the sequences in each batch to a common length (and optionally pack them with `pack_padded_sequence` so the LSTM skips the padded steps).
- Organize your batches so that each batch contains only sequences of the same length, which avoids padding altogether.
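For the padding-plus-packing route, a sketch of the usual pipeline (the hidden size of 8 is just a placeholder):

```python
import torch
from torch import nn
from torch.nn.utils.rnn import pad_sequence, pack_padded_sequence, pad_packed_sequence

seqs = [torch.randn(5, 1), torch.randn(3, 1), torch.randn(7, 1)]
lengths = torch.tensor([s.size(0) for s in seqs])

padded = pad_sequence(seqs, batch_first=True)  # (3, 7, 1)
# Packing tells the LSTM the true length of each sequence,
# so it does not run over the padded positions
packed = pack_padded_sequence(padded, lengths, batch_first=True, enforce_sorted=False)

lstm = nn.LSTM(input_size=1, hidden_size=8, batch_first=True)
out_packed, _ = lstm(packed)

# Unpack back to a padded tensor; steps past each true length are zeros
out, out_lengths = pad_packed_sequence(out_packed, batch_first=True)
print(out.shape)  # torch.Size([3, 7, 8])
```

`enforce_sorted=False` lets you pass the batch in any order; PyTorch sorts and unsorts internally.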
Which one is the better way, though?

There's probably not one best solution. For my work, I usually organize my batches by length to avoid the hassle of padding and packing. But that's just for basic research prototypes, and I'm the one looking at the code 🙂
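A sketch of that batch-organization idea, assuming a toy dataset: group the sequences by length, so each group can be stacked into a batch with no padding.

```python
import torch
from collections import defaultdict

# Hypothetical dataset: sequences of assorted lengths, one float per step
seqs = [torch.randn(n, 1) for n in [4, 6, 4, 6, 4, 6]]

# Bucket sequences by length; every bucket stacks cleanly into one batch
buckets = defaultdict(list)
for s in seqs:
    buckets[s.size(0)].append(s)

batches = [torch.stack(group) for group in buckets.values()]
for b in batches:
    print(b.shape)  # (3, 4, 1) and (3, 6, 1)
```

In practice you would wrap this in a custom `batch_sampler` for a `DataLoader`, but the grouping logic is the same.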

There are valid arguments against this approach, at least for practical purposes.
