LSTM input dimensions for batch size padding

I’m doing a sentiment analysis project with a large dataset of tweets. I didn’t want to pad my data to the longest tweet in the entire dataset, so I padded each batch to the longest tweet in that batch instead. But the dimensions across batches are obviously not the same, so I can’t figure out what to do with my model’s input dimensions.

Are dynamic input dimensions even possible, or do I need to do either padding to the largest data point in the dataset or some other kind of padding?

Different sequence lengths across different batches are not a problem.

It only matters that the sequences in the same batch have the same length.
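Per-batch padding like you describe is commonly done with a custom `collate_fn` in the `DataLoader`. Here is a minimal sketch (the toy sequences and the `collate` function name are just illustrative):

```python
import torch
from torch.nn.utils.rnn import pad_sequence
from torch.utils.data import DataLoader

# Toy dataset: token-index sequences of varying length (hypothetical data).
sequences = [torch.tensor([1, 2, 3]),
             torch.tensor([4, 5]),
             torch.tensor([6, 7, 8, 9])]

def collate(batch):
    # Pad only to the longest sequence *in this batch*.
    return pad_sequence(batch, batch_first=True, padding_value=0)

loader = DataLoader(sequences, batch_size=2, collate_fn=collate)
for batch in loader:
    print(batch.shape)  # the sequence dimension differs across batches
```

If you later want the LSTM to skip the pad positions entirely, you can also look at `torch.nn.utils.rnn.pack_padded_sequence`, but plain per-batch padding already works.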

The model dimensions are completely independent of the sequence length. The LSTM will always step through the entire sequence in a batch, whatever its length. You may have noticed that nn.LSTM offers no parameter for setting a sequence length: you only fix the input size and the hidden size.
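You can verify this directly: the same `nn.LSTM` instance happily consumes batches of different lengths, and only the sequence dimension of the output changes. A quick sketch (the sizes here are arbitrary):

```python
import torch
import torch.nn as nn

# input_size and hidden_size fix the model; sequence length is left free.
lstm = nn.LSTM(input_size=8, hidden_size=16, batch_first=True)

short_batch = torch.randn(4, 5, 8)   # batch of 4 sequences, length 5
long_batch = torch.randn(4, 50, 8)   # same model, length 50

out_short, _ = lstm(short_batch)
out_long, _ = lstm(long_batch)
print(out_short.shape)  # torch.Size([4, 5, 16])
print(out_long.shape)   # torch.Size([4, 50, 16])
```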