BERT with a GRU layer, wrong dimensions

Hey all, struggling a bit with the input dimensions.

I have a BERT model (from Hugging Face transformers) and want to add a GRU layer on top.
self.bert = BertModel.from_pretrained('bert-base-uncased')

I don’t fully understand the expected dimensions yet, and I’m getting this error:
RuntimeError: input must have 3 dimensions, got 2.

The GRU is defined as follows.
self.gru = nn.GRU(input_size=768 * 2, hidden_size=64, num_layers=2)

The output I take from BERT and feed into the GRU is
torch.cat((outputs1.last_hidden_state[:,0], outputs1.last_hidden_state[:,max_q_len]), 1)
which has shape torch.Size([32, 1536]).
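
In case it helps, here is a minimal sketch of how everything is wired together. The input_ids, attention_mask, the sequence length of 128 and the concrete max_q_len value are just placeholders I made up for the repro; the real code has more around it.

import torch
import torch.nn as nn
from transformers import BertModel

bert = BertModel.from_pretrained('bert-base-uncased')
gru = nn.GRU(input_size=768 * 2, hidden_size=64, num_layers=2)

input_ids = torch.randint(0, 30522, (32, 128))   # (batch, seq_len), dummy token ids
attention_mask = torch.ones_like(input_ids)
max_q_len = 20                                   # placeholder index

outputs1 = bert(input_ids=input_ids, attention_mask=attention_mask)
# concatenate the [CLS] embedding with the embedding at position max_q_len
combined = torch.cat(
    (outputs1.last_hidden_state[:, 0],
     outputs1.last_hidden_state[:, max_q_len]), 1)
print(combined.shape)    # torch.Size([32, 1536])

out, h = gru(combined)   # on my setup this raises: input must have 3 dimensions, got 2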

I can see that my output only has 2 dimensions, but what should the third one be? According to the docs, the GRU expects input of shape (seq_len, batch, input_size).
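
Just to sanity-check my reading of the docs: a dummy 3D tensor of that shape does go through the GRU without complaint (the 10 and 32 are arbitrary values I picked for illustration):

dummy = torch.randn(10, 32, 768 * 2)    # (seq_len, batch, input_size)
out, h = gru(dummy)
print(out.shape)                        # torch.Size([10, 32, 64])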

Would be glad if someone could help me out here. Thanks!