LSTM Size mismatch

Hi, I am trying to implement an LSTM model to predict coronavirus cases and I got this error: RuntimeError: Expected hidden[0] size (2, 134, 14), got (2, 14, 14)

Here is some information:
Len train: 149
Len validation: 37
Len test: 47
seq_len = 14
input_size = 1
output_size = 1
hidden_dim = 14
n_layers = 2
dropout = 0.4
num_epochs = 10

and the model is this:

class LSTM(nn.Module):
    def __init__(self, input_size, output_size, hidden_dim, n_layers, seq_len, dropout):
        super(LSTM, self).__init__()

        self.hidden_dim = hidden_dim

        self.lstm = nn.LSTM(input_size, hidden_dim, n_layers, batch_first=True, dropout=dropout)

        self.fc = nn.Linear(hidden_dim, output_size)

    def reset_hidden_state(self):
        self.hidden = (
            torch.zeros(n_layers, seq_len, hidden_dim),
            torch.zeros(n_layers, seq_len, hidden_dim))

    def forward(self, sequences):
        lstm_out, self.hidden = self.lstm(
            sequences.view(len(sequences), seq_len, -1),
            self.hidden)

        last_time_step = lstm_out.view(seq_len, len(sequences), self.n_hidden)[-1]
        y_pred = self.linear(last_time_step)
        return y_pred

What can I do to solve this problem?

There are a few issues in your code:

  • the states should have the shape [num_layers * num_directions, batch_size, hidden_size], while your self.hidden tuple uses seq_len in place of the batch size.
  • if you want to permute the dimensions in sequences using sequences.view(len(sequences), seq_len, -1), note that you would interleave the data, so use sequences.permute instead.
  • the same applies for lstm_out.view. Swapping dimensions should be done with permute. Since you are using batch_first=True, you can instead index the output via lstm_out[:, -1, :] without any permutation.
  • self.linear is undefined and should be replaced with self.fc (all of these fixes are applied in the sketch after this list).
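
Putting those points together, a minimal corrected sketch could look like this. Note that this assumes your batches already arrive as [batch_size, seq_len, input_size] (so no view/permute is needed at all); passing batch_size into reset_hidden_state is my addition so the states always match the current batch, and seq_len is then no longer needed inside the model:

import torch
import torch.nn as nn

class LSTM(nn.Module):
    def __init__(self, input_size, output_size, hidden_dim, n_layers, dropout):
        super(LSTM, self).__init__()
        self.hidden_dim = hidden_dim
        self.n_layers = n_layers
        self.lstm = nn.LSTM(input_size, hidden_dim, n_layers,
                            batch_first=True, dropout=dropout)
        self.fc = nn.Linear(hidden_dim, output_size)

    def reset_hidden_state(self, batch_size):
        # states: [num_layers * num_directions, batch_size, hidden_size]
        self.hidden = (
            torch.zeros(self.n_layers, batch_size, self.hidden_dim),
            torch.zeros(self.n_layers, batch_size, self.hidden_dim))

    def forward(self, sequences):
        # sequences: [batch_size, seq_len, input_size] since batch_first=True
        lstm_out, self.hidden = self.lstm(sequences, self.hidden)
        # take the output of the last time step for each sequence in the batch
        last_time_step = lstm_out[:, -1, :]
        return self.fc(last_time_step)

You would then call model.reset_hidden_state(sequences.size(0)) before feeding each batch, so the state shapes follow the batch size.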

The docs also provide additional information about the expected shapes. :wink:
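
For reference, here is a quick standalone shape check using the sizes from your error message (batch_size = 134 with your hyperparameters), which shows how the expected shapes line up:

import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=1, hidden_size=14, num_layers=2,
               batch_first=True, dropout=0.4)
x = torch.randn(134, 14, 1)           # [batch_size, seq_len, input_size]
hidden = (torch.zeros(2, 134, 14),    # [num_layers, batch_size, hidden_size]
          torch.zeros(2, 134, 14))
out, hidden = lstm(x, hidden)
print(out.shape)                      # torch.Size([134, 14, 14])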
