Concatenation of GRU and LSTM layers in PyTorch

Hello everyone, I am trying to concatenate a GRU layer and an LSTM layer in PyTorch. Example code is below. Can anyone help?
self.rnn_encoder_gru = nn.GRU(din, dhid, bidirectional=True, batch_first=True)
self.rnn_encoder_lstm = nn.LSTM(din, dhid, bidirectional=True, batch_first=True)
self.rnn_encoder = concatenate(self.rnn_encoder_lstm, self.rnn_encoder_gru)  # pseudocode: how can I combine these two encoders?

You cannot concatenate the modules themselves, but you could concatenate their output tensors.
Once you've created the modules, pass the input tensor through each of them and concatenate the outputs (or hidden states) afterwards.
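
For illustration, here is a minimal sketch of that approach (the Encoder class, its forward method, and the dummy input shapes are assumptions for this example; din and dhid follow the question's code):

import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, din, dhid):
        super().__init__()
        self.rnn_encoder_gru = nn.GRU(din, dhid, bidirectional=True, batch_first=True)
        self.rnn_encoder_lstm = nn.LSTM(din, dhid, bidirectional=True, batch_first=True)

    def forward(self, x):
        # x: (batch, seq_len, din)
        out_gru, _ = self.rnn_encoder_gru(x)    # (batch, seq_len, 2 * dhid) since bidirectional
        out_lstm, _ = self.rnn_encoder_lstm(x)  # (batch, seq_len, 2 * dhid)
        # concatenate along the feature dimension -> (batch, seq_len, 4 * dhid)
        return torch.cat([out_gru, out_lstm], dim=-1)

enc = Encoder(din=16, dhid=32)
out = enc(torch.randn(8, 10, 16))  # dummy batch of 8 sequences, length 10
print(out.shape)                   # torch.Size([8, 10, 128])

The same pattern works for the hidden states: concatenate them along the feature (last) dimension with torch.cat instead of the full output sequences.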

OK, thanks. I got the answer.