Dimension Error with bidirectional LSTM

Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.7/dist-packages/torch/nn/modules    /module.py", line 224, in __call__
    result = self.forward(*input, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/torch/nn/modules/rnn.py", line 162, in forward
    output, hidden = func(input, self.all_weights, hx)
File "/usr/local/lib/python2.7/dist-packages/torch/nn/_functions/rnn.py", line 351, in forward
    return func(input, *fargs, **fkwargs)
File "/usr/local/lib/python2.7/dist-packages/torch/nn/_functions/rnn.py", line 244, in forward
    nexth, output = func(input, hidden, weight)
File "/usr/local/lib/python2.7/dist-packages/torch/nn/_functions/rnn.py", line 84, in forward
    hy, output = inner(input, hidden[l], weight[l])
IndexError: list index out of range

I'm trying to run the forward pass of a bidirectional LSTM, but I'm getting this error. I have an embedding layer before it, so my input tensor is 10 x 1 x 100 (a 10-word sequence, batch size 1, and a 100-dimensional embedding), and my hidden tensor is 1 x 1 x 25 (1 layer, batch size 1, and a hidden dimension of 50 divided by 2 because the LSTM is bidirectional). How do I configure my inputs so that they have the correct dimensions? Thanks

my hidden tensor is 1 x 1 x 25

I think that's the problem. The docs say the hidden state should have shape

h_n (num_layers * num_directions, batch, hidden_size): tensor
containing the hidden state for t=seq_len

Since a bidirectional LSTM has num_directions = 2, your hidden (and cell) tensors need to be 2 x 1 x 25, not 1 x 1 x 25. Read the docs for more info.
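For example, here is a minimal sketch using the sizes from your post (written against a recent PyTorch API; on the old version in your traceback you would wrap the tensors in torch.autograd.Variable first):

```python
import torch
import torch.nn as nn

# Sizes taken from the question: 100-dim embeddings, hidden size 25,
# one layer, bidirectional (so num_directions = 2).
lstm = nn.LSTM(input_size=100, hidden_size=25, num_layers=1, bidirectional=True)

seq = torch.randn(10, 1, 100)  # (seq_len, batch, input_size)

# num_layers * num_directions = 1 * 2 = 2, so h_0 and c_0 must be (2, 1, 25).
h0 = torch.randn(2, 1, 25)
c0 = torch.randn(2, 1, 25)

output, (hn, cn) = lstm(seq, (h0, c0))
print(output.shape)  # torch.Size([10, 1, 50])
```

Note that the output's last dimension is 2 * hidden_size = 50 (forward and backward outputs concatenated), which matches the overall hidden dimension of 50 you were aiming for.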


Yes, thanks, I think I figured it out!