In a bidirectional LSTM, which slices of the hidden state correspond to the forward versus backward direction?

I am using the nn.LSTM() layer with bidirectional set to True. I have a 3-layer network, so I get a hidden tensor of shape [6, BatchSize, hidden_size]. The first dimension of hidden contains both the forward and backward direction of the LSTM for each of the 3 layers, but I am not clear on how that tensor is laid out. Does hidden[0, :, :] correspond to the first (bottom) layer in the forward direction, and hidden[1, :, :] to the bottom layer in the backward direction?

I actually need to concatenate the hidden states from the forward and backward pass of each layer, so I imagine that would be something like the code below, but can someone correct me if I am wrong?

output, (hidden, cell) = self.lstm1(x)
new_hidden = torch.cat((hidden[0::2, :, :], hidden[1::2, :, :]), dim=2)

So new_hidden should be of size [3, BatchSize, 2 * hidden_size].

From the LSTM docs:

h_n of shape (num_layers * num_directions, batch, hidden_size): tensor containing the hidden state for t = seq_len.
Like output, the layers can be separated using h_n.view(num_layers, num_directions, batch, hidden_size) and similarly for c_n.

You could thus use the mentioned view operation to separate the directions and index this tensor.
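To make the layout concrete, here is a small sketch (with arbitrary toy sizes) that separates the directions via the documented view and checks that it matches the strided indexing from the question:

```python
import torch
import torch.nn as nn

# Toy 3-layer bidirectional LSTM; sizes are arbitrary for illustration.
num_layers, hidden_size, input_size, batch = 3, 5, 4, 2
lstm = nn.LSTM(input_size, hidden_size, num_layers=num_layers, bidirectional=True)

x = torch.randn(7, batch, input_size)      # (seq_len, batch, input_size)
output, (h_n, c_n) = lstm(x)

# h_n: (num_layers * num_directions, batch, hidden_size) = (6, 2, 5).
# Separate layers and directions as the docs describe:
h = h_n.view(num_layers, 2, batch, hidden_size)
fwd, bwd = h[:, 0], h[:, 1]                # each (num_layers, batch, hidden_size)

# Concatenate forward and backward states per layer:
new_hidden = torch.cat((fwd, bwd), dim=2)  # (num_layers, batch, 2 * hidden_size)

# Equivalent strided indexing on h_n directly: even rows are forward,
# odd rows are backward, ordered bottom layer to top layer.
alt = torch.cat((h_n[0::2], h_n[1::2]), dim=2)
assert torch.equal(new_hidden, alt)
```

Since the two results are equal, hidden[0] is indeed the bottom layer's forward state and hidden[1] its backward state, so the 0::2 / 1::2 indexing in the question does what you intend.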