Size of the LSTM hidden layer

How does the number of neurons in the hidden layer affect the LSTM?


More neurons in the hidden layer means more parameters to learn, for more ‘connections’. For a single time step, an LSTM has a structure much like an MLP, except that there is a second parameter matrix applied to the ‘previous state’. So, just as with an MLP, more neurons means you are projecting your data into a higher-dimensional space, and are thus able to model more complex shapes of data.
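As a concrete illustration (a quick sketch using PyTorch's `nn.LSTM`; the input size and hidden sizes below are arbitrary), the hidden size shows up directly in the shapes of both per-layer weight matrices, and the parameter count grows with it:

```python
import torch.nn as nn

# Arbitrary sizes, just to show how hidden_size appears in the weight shapes.
for hidden_size in (32, 128):
    lstm = nn.LSTM(input_size=10, hidden_size=hidden_size)
    n_params = sum(p.numel() for p in lstm.parameters())
    # weight_ih_l0: (4*hidden_size, input_size)  -- input-to-hidden weights (4 gates stacked)
    # weight_hh_l0: (4*hidden_size, hidden_size) -- hidden-to-hidden weights for the 'previous state'
    print(hidden_size,
          tuple(lstm.weight_ih_l0.shape),
          tuple(lstm.weight_hh_l0.shape),
          n_params)
```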

Ultimately, the effect does not seem to be extremely well understood, and finding a ‘good’ number takes experimentation. Here is a Stack Exchange question on that topic: https://ai.stackexchange.com/questions/3156/how-to-select-number-of-hidden-layers-and-number-of-memory-cells-in-lstm
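For what it's worth, that experimentation is usually just a sweep over candidate sizes. Here is a minimal sketch using random placeholder data and a made-up regression head, just to show the loop; in practice you would swap in your own dataset and compare proper validation metrics:

```python
import torch
import torch.nn as nn

# Random placeholder data: (seq_len, batch, input_size) and one target per sequence.
torch.manual_seed(0)
x = torch.randn(50, 16, 8)
y = torch.randn(16, 1)

for hidden_size in (16, 64, 256):
    lstm = nn.LSTM(input_size=8, hidden_size=hidden_size)
    head = nn.Linear(hidden_size, 1)
    opt = torch.optim.Adam(list(lstm.parameters()) + list(head.parameters()), lr=1e-3)
    for _ in range(20):                    # a few training steps per candidate size
        out, _ = lstm(x)                   # out: (seq_len, batch, hidden_size)
        loss = nn.functional.mse_loss(head(out[-1]), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
    print(f"hidden_size={hidden_size}: final loss {loss.item():.4f}")
```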
