Concatenation of h(t) of forward and h(0) of backward in bidirectional LSTM output

Hi all, I am confused about the output of an LSTM when bidirectional is True. For example:

char_rnn_out, char_hidden = self.char_lstm(char_embeds, char_hidden)

The documentation describes h_n in the LSTM output like this:

h_n (num_layers * num_directions, batch, hidden_size): tensor containing the hidden state for t=seq_len

I am wondering: if I set bidirectional to True, is h_n the concatenation of h(t) of the forward LSTM and h(0) of the backward LSTM?

I want to use character-level LSTM features that concatenate the hidden vector of the last character from the forward pass with the hidden vector of the first character from the backward pass. Can I use char_rnn_out or char_hidden[0] directly as the character features?

Here is the structure (from “Neural Architectures for Named Entity Recognition”):


After several trials, I found that the returned hidden state gives what I want: h_n is the concatenation of h(t) of the forward LSTM and h(0) of the backward LSTM (the backward direction reads the sequence in reverse, so its final state aligns with the first input timestep). That is exactly what I need.
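For anyone else checking this, here is a minimal sketch (the sizes are arbitrary, chosen just for illustration) that verifies the relationship between h_n and output for a single-layer bidirectional LSTM:

```python
import torch
import torch.nn as nn

# Arbitrary sizes for illustration.
seq_len, batch, input_size, hidden_size = 5, 3, 8, 4

# Single-layer bidirectional LSTM; input is (seq_len, batch, input_size)
# since batch_first defaults to False.
lstm = nn.LSTM(input_size, hidden_size, num_layers=1, bidirectional=True)

x = torch.randn(seq_len, batch, input_size)
output, (h_n, c_n) = lstm(x)

# output: (seq_len, batch, 2 * hidden_size) -- forward and backward concatenated
# h_n:    (2, batch, hidden_size)           -- [forward, backward]

# Forward direction: h_n[0] is the forward state after the LAST timestep,
# i.e. the first half of output at t = seq_len - 1.
assert torch.allclose(h_n[0], output[-1, :, :hidden_size])

# Backward direction: h_n[1] is the backward state after reading the whole
# sequence in reverse, which aligns with the FIRST timestep,
# i.e. the second half of output at t = 0.
assert torch.allclose(h_n[1], output[0, :, hidden_size:])

# Concatenating the two gives [forward h(t); backward h(0)] -- the
# character feature described above.
char_features = torch.cat([h_n[0], h_n[1]], dim=-1)  # (batch, 2 * hidden_size)
```

So char_hidden[0] (after reshaping the direction dimension away) is the right thing to use; taking output[-1] instead would pair the forward h(t) with the backward state at t = seq_len, which has only seen one character.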