How to reshape (num_layers * num_directions, batch, hidden_size) to (batch, hidden_size * 2) correctly

Hi everybody,

I have a variable whose shape is (1 * num_directions, batch, hidden_size). It is the final hidden state h_t of a bidirectional LSTM. I would like to reshape it to (batch, 2 * hidden_size).

More specifically, I want to concatenate the final hidden state of the forward LSTM and the backward LSTM. Is this the correct way to do it?

h_t_concatenated = torch.reshape(h_t,(batch,hidden_size*num_directions))

Or should I first permute the dimensions, as follows?

h_t_new = h_t.permute(1,0,2)
h_t_concatenated = torch.reshape(h_t_new,(batch,hidden_size*num_directions))
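For reference, here is a small self-contained sanity check of the two variants with toy sizes (the sizes are made up for illustration; my real h_t comes from the LSTM):

```python
import torch

# Toy sizes for illustration only
num_directions, batch, hidden_size = 2, 3, 4
h_t = torch.randn(num_directions, batch, hidden_size)

# Variant 1: direct reshape, keeping the original dimension order
a = torch.reshape(h_t, (batch, hidden_size * num_directions))

# Variant 2: move the batch dimension to the front, then reshape
b = torch.reshape(h_t.permute(1, 0, 2), (batch, hidden_size * num_directions))

# Explicit concatenation of the forward and backward final states,
# which is what I actually want each row to contain
c = torch.cat([h_t[0], h_t[1]], dim=1)

print(torch.equal(b, c))  # True: permute-then-reshape matches the explicit cat
print(torch.equal(a, b))  # False: the direct reshape mixes batch elements
```

If I read this right, the direct reshape walks the memory in (direction, batch) order, so each output row mixes hidden states from different batch elements, while permuting first puts each batch element's forward and backward states side by side.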