Best way to concatenate the final hidden states of a bidirectional RNN

Hi!
What is the best way to concatenate the final hidden states of the two directions of a bidirectional RNN?
For example, with 2 layers, batch_size = 5, and hidden_size = 10, a BiRNN returns h and c with shape (4, 5, 10), i.e. (num_layers * num_directions, batch, hidden). In my case I need shape (2, 5, 20), because I will feed these states to a decoder.

Thanks!

You can split the states into their forward and backward halves and join them with torch.cat(..., dim=2) (torch.stack would add a new dimension instead of producing the (2, 5, 20) shape you want); see the docs for torch.cat for more details.
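A minimal sketch of that reshaping, assuming a bidirectional LSTM encoder with the dimensions from the question (the encoder itself and its input size are made up for illustration):

```python
import torch
import torch.nn as nn

num_layers, batch_size, hidden_size, input_size = 2, 5, 10, 8

# Hypothetical bidirectional encoder matching the shapes in the question.
rnn = nn.LSTM(input_size=input_size, hidden_size=hidden_size,
              num_layers=num_layers, bidirectional=True)

x = torch.randn(7, batch_size, input_size)  # (seq_len, batch, input_size)
_, (h, c) = rnn(x)                          # h, c: (num_layers * 2, batch, hidden)

def cat_directions(state):
    # (num_layers * 2, batch, hidden) -> (num_layers, 2, batch, hidden)
    state = state.view(num_layers, 2, batch_size, hidden_size)
    # Concatenate forward and backward states along the hidden dimension.
    return torch.cat([state[:, 0], state[:, 1]], dim=2)

h_cat = cat_directions(h)  # (2, 5, 20)
c_cat = cat_directions(c)  # (2, 5, 20)
```

Both h_cat and c_cat now have shape (num_layers, batch, 2 * hidden_size), which matches a decoder whose hidden size is 2 * hidden_size.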