I don't understand the bi-LSTM output

Today I learned something about LSTMs; here is my question.
As shown in PyTorch's documentation, the LSTM returns a tuple.

For example, when the sentence "I like eating apple" is put into a bi-LSTM network, the output is:
lstm_out, (h_n, c_n)

I want to know what h_n means. Is it the concatenation of the vectors for "apple" and "I", or just the vector for "apple" from both the backward and the forward LSTM?
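To make the question concrete, here is a minimal runnable sketch of that tuple. The sizes (embed_dim, hidden_size, num_layers) are hypothetical, chosen only for illustration, and the four token embeddings are random stand-ins for "I like eating apple":

```python
import torch
import torch.nn as nn

# Hypothetical sizes, for illustration only
embed_dim, hidden_size, num_layers = 8, 16, 2
bilstm = nn.LSTM(embed_dim, hidden_size, num_layers, bidirectional=True)

# "I like eating apple" -> 4 time steps, batch of 1, random embeddings
sentence = torch.randn(4, 1, embed_dim)

lstm_out, (h_n, c_n) = bilstm(sentence)

print(lstm_out.shape)  # (seq_len, batch, num_directions * hidden_size) -> (4, 1, 32)
print(h_n.shape)       # (num_layers * num_directions, batch, hidden_size) -> (4, 1, 16)
print(c_n.shape)       # same shape as h_n
```

Note that lstm_out has one entry per time step (with the two directions concatenated on the last axis), while h_n has one entry per layer and direction.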

h_n will contain the last hidden state of the forward direction ("I like eating apple") and the last hidden state of the backward direction ("apple eating like I"), just combined in one tensor. The hidden state for each direction might have multiple layers, hence the num_layers*num_directions in the shape of h_n.

Say you want to add the last hidden states of the forward and backward pass. That would look like this:

h_n = h_n.view(num_layers, num_directions, batch, hidden_size)  # separate num_layers and num_directions
h_n = h_n[-1]          # keep only the last layer
h_n = h_n[0] + h_n[1]  # add the hidden states of the forward and backward directions

This is just one example; it's up to you how you want to further process the output.
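The reshape-and-sum steps above can be checked end to end. A sketch with hypothetical sizes: it verifies that the last layer's forward state in h_n equals the first half of lstm_out at the last time step, and that the backward state equals the second half of lstm_out at the first time step, before summing them:

```python
import torch
import torch.nn as nn

# Hypothetical sizes, for illustration only
num_layers, num_directions = 2, 2
seq_len, batch, embed_dim, hidden_size = 4, 1, 8, 16
bilstm = nn.LSTM(embed_dim, hidden_size, num_layers, bidirectional=True)

x = torch.randn(seq_len, batch, embed_dim)
lstm_out, (h_n, _) = bilstm(x)

h_n = h_n.view(num_layers, num_directions, batch, hidden_size)
last = h_n[-1]  # last layer: (num_directions, batch, hidden_size)

# Forward final state == first half of lstm_out at the LAST time step
assert torch.allclose(last[0], lstm_out[-1, :, :hidden_size])
# Backward final state == second half of lstm_out at the FIRST time step
assert torch.allclose(last[1], lstm_out[0, :, hidden_size:])

summed = last[0] + last[1]  # combined forward + backward hidden state
print(summed.shape)         # (batch, hidden_size) -> (1, 16)
```

This also answers the original question directly: the forward half of h_n corresponds to "apple" (end of the forward pass), and the backward half corresponds to "I" (end of the backward pass).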

Thank you for your reply. I really appreciate that you helped me resolve my question.

I'm also confused about when to use h_n versus lstm_out. And should h_0 be a random tensor or a zero tensor? Is there any difference?