Invalid shape error in pad_packed_sequence

I’m trying to implement a pyramidal Bi-LSTM module. The forward function runs the following code:

from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

# input to pack_padded_sequence is torch.Size([618, 16, 512])
x = pack_padded_sequence(x, x_len, batch_first=False, enforce_sorted=False)
out, _ = self.blstm(x)  # out.data.size() = torch.Size([8632, 256])
outputs, output_lens = pad_packed_sequence(out, batch_first=False)  # this errors out

The LSTM executes fine, but pad_packed_sequence throws the following error when padding back:


<ipython-input-10-0412449191e5> in forward(self, x, x_len)
     17     out, _ = self.blstm(x)
     18     print("pblstm output: ", out.data.size())
---> 19     outputs, output_lens = pad_packed_sequence(out, batch_first=False)
     20     # out = out.transpose(0, 1)
     21     # return out, hidden

/usr/local/lib/python3.6/dist-packages/torch/nn/utils/rnn.py in pad_packed_sequence(sequence, batch_first, padding_value, total_length)
    327         max_seq_length = total_length
    328     padded_output, lengths = _VF._pad_packed_sequence(
--> 329         sequence.data, sequence.batch_sizes, batch_first, padding_value, max_seq_length)
    330     unsorted_indices = sequence.unsorted_indices
    331     if unsorted_indices is not None:

RuntimeError: shape '[156, 8, 256]' is invalid for input of size 114688

I don’t understand where the shape [156, 8, 256] comes from or what 114688 represents. I’d appreciate any help, thanks.

Maybe you can check whether your x and x_len match.
I ran into this error today; it was because my x was [64, 50, 300], but x_len contained an element of 51 (bigger than the actual sequence length of 50).
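
For what it’s worth, an error of the form “shape '[a, b, c]' is invalid for input of size N” is PyTorch’s generic reshape failure: it tried to view a buffer of N elements as [a, b, c] and the element counts don’t match. pad_packed_sequence derives the padded shape, [max_seq_len, batch_size, features] (here [156, 8, 256]), from the packed sequence’s batch_sizes, which in turn come from the x_len you passed to pack_padded_sequence, so a mismatch between x and x_len tends to surface exactly at this point. Below is a minimal sketch of a pre-pack sanity check; the shapes and variable names are illustrative assumptions, not taken from the original module.

import torch
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

# Illustrative standalone shapes: 50 timesteps, batch of 4, 300 features.
x = torch.randn(50, 4, 300)             # [seq_len, batch, features], batch_first=False
x_len = torch.tensor([50, 48, 32, 30])  # a value > 50 here is the mismatch described above

# Sanity check before packing: no length may exceed the time dimension of x.
assert x_len.max().item() <= x.size(0), (
    f"max length {x_len.max().item()} exceeds seq_len {x.size(0)}"
)

packed = pack_padded_sequence(x, x_len, batch_first=False, enforce_sorted=False)
outputs, output_lens = pad_packed_sequence(packed, batch_first=False)
print(outputs.shape)  # torch.Size([50, 4, 300])

And since this is a pyramidal BLSTM: if the module downsamples the time dimension (e.g. concatenates pairs of frames, as pBLSTMs typically do), remember to halve x_len to match before packing, otherwise the lengths no longer describe the reshaped tensor.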


Thanks a lot for your answer. It really helped me.