I am working on an encoder that uses an LSTM.
```python
def init_hidden(self, batch):
    """
    Initialize the encoder's LSTM hidden state with the number of
    layers, batch size, and hidden-layer dimension.

    :param batch: batch size
    :return: tuple (h_0, c_0) of zero tensors
    """
    return (
        torch.zeros(self.num_layers, batch, self.h_dim).cuda(),
        torch.zeros(self.num_layers, batch, self.h_dim).cuda()
    )
```
This is the code that initializes the LSTM's hidden state. It is from the Social GAN algorithm. What does `batch` (the batch size) represent for the LSTM? Is it the number of LSTMs used in the encoder?
Can someone please explain what it means?
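For reference, here is a minimal standalone sketch I put together (my own example with made-up dimensions, not the Social GAN code) showing how I understand the shapes involved when such zero tensors are passed into a plain `nn.LSTM`:

```python
import torch
import torch.nn as nn

# Hypothetical dimensions for illustration only
num_layers, batch, h_dim, input_dim, seq_len = 2, 4, 8, 3, 5

lstm = nn.LSTM(input_dim, h_dim, num_layers)

# Initial hidden and cell states, shaped (num_layers, batch, h_dim),
# matching what init_hidden() appears to return
h0 = torch.zeros(num_layers, batch, h_dim)
c0 = torch.zeros(num_layers, batch, h_dim)

# Input of shape (seq_len, batch, input_dim)
x = torch.randn(seq_len, batch, input_dim)
out, (hn, cn) = lstm(x, (h0, c0))
print(out.shape)  # (seq_len, batch, h_dim)
```

My current guess is that `batch` here is the number of sequences processed in parallel, but I would like confirmation.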