Pooling over PackedSequence

Hi,

I have the following network:

# pack the padded batch, run the RNN, then unpack back to a padded tensor
input = torch.nn.utils.rnn.pack_padded_sequence(input, sequenceLengths, batch_first=True)
output, hidden = GRU(input, hiddenLayer)  # GRU is an nn.GRU instance; hiddenLayer is the initial hidden state
output, seqLengths = torch.nn.utils.rnn.pad_packed_sequence(output, batch_first=True)

How do I do max pooling over output when the sequence lengths are variable? Assume output has shape N x L x hiddenLayerDim, where N = batch size and L = length of the longest sequence.

I would like to do max pooling along the length dimension (L, i.e. dim=1).

I think you should use nn.functional.max_pool1d.
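
For reference, a minimal sketch of how that would look on the padded output tensor (assuming the N x L x hiddenLayerDim layout from above; max_pool1d pools over the last dimension, so L has to be moved there first):

import torch.nn.functional as F

# output: (N, L, H) from pad_packed_sequence(..., batch_first=True)
x = output.permute(0, 2, 1)                      # (N, H, L)
pooled = F.max_pool1d(x, kernel_size=x.size(2))  # (N, H, 1): one max per channel
pooled = pooled.squeeze(2)                       # (N, H)

Note that this pools over the full padded length L, which leads to the issue raised below.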

But the sequence lengths differ across slices of the batch, and the values in the sequences may be negative, so I don't want the padded zeros to be picked up as the max if I simply apply max_pool1d.

One option would be to pad with float("-Inf") instead of padding with 0, so that the max operation will never return padding.
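
A minimal sketch of that idea: pad_packed_sequence takes a padding_value argument, so replacing the unpacking call from the snippet above makes the padding -inf from the start, and no separate masking step is needed:

import torch

# unpack with -inf padding so the max can never land on a padded position
output, seqLengths = torch.nn.utils.rnn.pad_packed_sequence(
    output, batch_first=True, padding_value=float("-inf")
)
pooled, _ = output.max(dim=1)  # (N, H): max over the length dimension

Every row has at least one real timestep, so the resulting max is always finite.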

You might want to consider creating batches where all sequences within one batch have the same length. No need for padding or packing; see this older post. It’s simply convenient, and I use it all the time.
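
For illustration, a hypothetical helper (the name and interface are made up here, not part of torch) that buckets sequences by length so every batch is uniform:

from collections import defaultdict

def length_bucketed_batches(sequences, batch_size):
    # group indices of sequences that share the same length
    buckets = defaultdict(list)
    for i, seq in enumerate(sequences):
        buckets[len(seq)].append(i)
    # emit fixed-size batches from each bucket; every batch is uniform-length
    for indices in buckets.values():
        for start in range(0, len(indices), batch_size):
            yield indices[start:start + batch_size]

Each yielded list of indices can then be stacked into a tensor directly, with no padding or packing needed.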